# GPT2-Home 🏠

## Hugging Face 🤗 model page

Make sure to check out the model page on Hugging Face.

## Live Demo

To test the model with a custom configuration, please visit the Demo.

## Blog Post

For more detailed information about the project's development, please refer to my blog post.

## How to use

You can use this model directly with a pipeline for text generation:

### Prerequisites

```bash
pip install -U transformers
```

### How to generate using the pipeline

```python
from transformers import pipeline

def remove_repetitions(text):
    # Keep only the first occurrence of each sentence.
    first_occurrences = []
    for sentence in text.split("."):
        if sentence not in first_occurrences:
            first_occurrences.append(sentence)
    return '.'.join(first_occurrences)

def trim_last_sentence(text):
    # Drop any trailing, unfinished sentence after the last period.
    return text[:text.rfind(".") + 1]

def clean_txt(text):
    return trim_last_sentence(remove_repetitions(text))

generator = pipeline('text-generation', 'HamidRezaAttar/gpt2-product-description-generator')

query = input("Please enter your text prompt: ")

generated_text = clean_txt(generator(query)[0]['generated_text'])

print(generated_text)
```
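
The text-generation pipeline also forwards standard generation keyword arguments (such as `max_length`, `do_sample`, `top_k`, `top_p`, and `num_return_sequences`) to the underlying model, so you can experiment with output length and sampling. A minimal sketch; the prompt and parameter values below are only illustrative, not the demo's settings:

```python
from transformers import pipeline

generator = pipeline('text-generation', 'HamidRezaAttar/gpt2-product-description-generator')

# These are standard transformers generation arguments; the values are examples only.
outputs = generator(
    "Maximize your bedroom space without sacrificing style with this storage bed.",
    max_length=100,            # cap the total length of the generated sequence
    do_sample=True,            # sample instead of greedy decoding
    top_k=50,                  # restrict sampling to the 50 most likely tokens
    top_p=0.95,                # nucleus sampling threshold
    num_return_sequences=2,    # generate two candidate descriptions
)

for out in outputs:
    print(out['generated_text'])
```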

## Run a quick Demo

| Notebook | |
| --- | --- |
| Test Outputs | Open In Colab |

## How to fine-tune GPT-2

You can fine-tune GPT-2 on any text-generation task using the fastai library.

| Notebook | |
| --- | --- |
| fine-tune GPT-2 | Open In Colab |
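
For a quick orientation before opening the notebook, here is a minimal sketch following the general pattern of fastai's transformers tutorial for wrapping a Hugging Face GPT-2 model in a fastai `Learner`. The `texts` list, batch size, sequence length, and learning rate are placeholders, not the notebook's actual settings:

```python
from transformers import GPT2LMHeadModel, GPT2TokenizerFast
from fastai.text.all import *

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

class TransformersTokenizer(Transform):
    "Wrap the Hugging Face tokenizer so fastai can use it."
    def __init__(self, tokenizer): self.tokenizer = tokenizer
    def encodes(self, x):
        return tensor(self.tokenizer.encode(x, truncation=True, max_length=512))
    def decodes(self, x):
        return TitledStr(self.tokenizer.decode(x.cpu().numpy()))

# Placeholder corpus: replace with your full dataset of product descriptions.
texts = ["A cozy two-seat sofa with removable, machine-washable covers.",
         "A solid oak dining table that comfortably seats six.",
         "A queen storage bed with four spacious under-mattress drawers.",
         "A minimalist floating shelf that mounts with hidden brackets."]

splits = RandomSplitter(valid_pct=0.2)(range_of(texts))
tls = TfmdLists(texts, TransformersTokenizer(tokenizer),
                splits=splits, dl_type=LMDataLoader)
dls = tls.dataloaders(bs=4, seq_len=256)

class DropOutput(Callback):
    "GPT-2 returns extra outputs; keep only the logits for the loss."
    def after_pred(self): self.learn.pred = self.pred[0]

learn = Learner(dls, model, loss_func=CrossEntropyLossFlat(),
                cbs=[DropOutput], metrics=Perplexity())
learn.fit_one_cycle(1, 1e-4)
```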

## Citation info

```bibtex
@misc{GPT2-Home,
  author = {HamidReza Fatollah Zadeh Attar},
  title = {GPT2-Home the English home product description generator},
  year = {2021},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/HamidRezaAttar/GPT2-Home}},
}
```

## Dataset Reference

J. McAuley, C. Targett, J. Shi, A. van den Hengel. "Image-based recommendations on styles and substitutes." SIGIR, 2015.

## Questions?

Post a GitHub issue on this repository.