unsupervised-chatbot-GPT2

Fine-tuning GPT-2 and implementing a text-generation chatbot. This project aims to develop a memorable and emotional chatbot using transfer learning (fine-tuning the GPT-2 345M model). You can find the original code here.

It is not intended for commercial purposes.

Result


Install Python libraries:

This project works with both TensorFlow 1.x and TensorFlow 2.x.

$ pip install tensorflow
$ pip install -r requirements.txt
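Support for both TensorFlow 1.x and 2.x usually comes down to the v1 compatibility layer. A minimal sketch of that pattern, assuming this repository follows the common tf.compat.v1 approach (not verified from the source):

```python
# Minimal sketch (assumption): TF1-style GPT-2 graph code is typically run on
# TensorFlow 2.x through the v1 compatibility layer instead of eager mode.
import tensorflow as tf

print("TensorFlow", tf.__version__)

if tf.__version__.startswith("2"):
    import tensorflow.compat.v1 as tf
    tf.disable_eager_execution()  # keep tf.Session / placeholders working
```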

Model installation

  1. Click the link and download the model.
  2. Place the downloaded model in models\345M_org (a quick sanity check is sketched below).
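Before running anything, it can help to confirm the checkpoint landed where the code expects it. The file names below assume the standard OpenAI GPT-2 release layout; the models\345M_org path comes from step 2 above:

```python
# Sketch: check that the 345M checkpoint sits in models/345M_org.
# File names assume the standard OpenAI GPT-2 checkpoint layout (assumption).
import os

MODEL_DIR = os.path.join("models", "345M_org")
EXPECTED = ["checkpoint", "encoder.json", "hparams.json",
            "model.ckpt.meta", "vocab.bpe"]

missing = [f for f in EXPECTED if not os.path.exists(os.path.join(MODEL_DIR, f))]
print("Missing files:", missing if missing else "none, model looks complete")
```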

Usage

Just run main.py.

or

If you want to run it from the command line:

$ python main.py

If you want to set hyperparameters:

$ python main.py --top_k 10 --temperature 0.9 --nsamples 3
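To make these flags concrete, here is a small NumPy sketch of how top_k, temperature, and nsamples typically act during GPT-2 decoding. It illustrates the sampling idea only; it is not this project's own code, which inherits its sampling loop from the original gpt-2 implementation:

```python
# Illustration only: how --top_k and --temperature usually shape the choice of
# the next token from a vector of logits; --nsamples repeats the whole loop.
import numpy as np

def sample_next_token(logits, top_k=10, temperature=0.9, rng=None):
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature  # sharpen or flatten
    if top_k > 0:
        cutoff = np.sort(logits)[-top_k]                 # k-th largest logit
        logits = np.where(logits < cutoff, -np.inf, logits)
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return int(rng.choice(len(probs), p=probs))

fake_logits = [2.0, 1.5, 0.3, -1.0, 0.8]
for _ in range(3):  # roughly what --nsamples 3 does at the sample level
    print(sample_next_token(fake_logits, top_k=3, temperature=0.9))
```

Lower top_k and temperature make replies more predictable; higher values make them more varied.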

My dataset

My dataset is a .txt file (760 KB) containing conversations between a bot and a user (my own file).

An example is shown below.
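As a hedged, hypothetical illustration only (the real file's layout is an assumption here), a conversation file with alternating user/bot utterances, one per line, could be read into training pairs like this; dialogue.txt is a made-up file name:

```python
# Hypothetical illustration only: assumes alternating user/bot turns,
# one utterance per line; the real dataset's layout may differ.
def load_dialogue_pairs(path):
    with open(path, encoding="utf-8") as f:
        turns = [line.strip() for line in f if line.strip()]
    return list(zip(turns[0::2], turns[1::2]))  # (user turn, bot reply) pairs

for user, bot in load_dialogue_pairs("dialogue.txt")[:3]:
    print("USER:", user)
    print("BOT: ", bot)
```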

Authors

Jungseob Lee / js-lee-AI / omanma1928@naver.com

Jungmu Park / boong_u / madogisa12@naver.com

Related papers

A. Radford et al., "Language Models are Unsupervised Multitask Learners", OpenAI blog, 2019.

A. Vaswani et al., "Attention Is All You Need", NIPS 2017.

References

OpenAI

License

Modified MIT