A whole GPT (Generative Pre-trained Transformer) language model in a single Jupyter Notebook, written in PyTorch.
Think of this as a small starter kit for your next NLP project. All you need to provide is a text file.
The model is trained on this text, line by line, and learns to imitate it.
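A typical first step for this kind of setup is to build a character-level vocabulary from the provided text and map it to token IDs. The sketch below is an assumption about the preprocessing, not the notebook's exact pipeline; the inline `text` string stands in for the user-provided file.

```python
# Minimal sketch of assumed character-level preprocessing;
# the notebook's actual tokenization may differ.
text = "hello world\nhello gpt\n"  # stand-in for the user-provided text file

chars = sorted(set(text))                      # unique characters form the vocabulary
stoi = {ch: i for i, ch in enumerate(chars)}   # character -> token ID
itos = {i: ch for ch, i in stoi.items()}       # token ID -> character

def encode(s):
    """Map a string to a list of integer token IDs."""
    return [stoi[c] for c in s]

def decode(ids):
    """Map a list of token IDs back to a string."""
    return "".join(itos[i] for i in ids)

ids = encode(text)
```

Encoding followed by decoding is lossless, so `decode(encode(text)) == text`; the resulting ID sequence is what would be batched and fed to the model during training.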
MK2112/GPTemplate