Toy-GPT

Toy-GPT is a decoder-only transformer built from scratch with NumPy and PyTorch, trained to generate natural language in the style of input.txt.
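The core of a decoder-only transformer is masked (causal) self-attention: each position may attend only to itself and earlier positions. This repo's actual implementation isn't shown here, so the following is a minimal single-head sketch in NumPy; all names (`causal_self_attention`, `Wq`, `Wk`, `Wv`) are illustrative, not taken from the codebase.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(x, Wq, Wk, Wv):
    # Single-head masked self-attention over a (T, C) sequence.
    T = x.shape[0]
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])        # (T, T) token affinities
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -np.inf                         # block attention to future tokens
    return softmax(scores) @ v                     # weighted sum of value vectors

rng = np.random.default_rng(0)
T, C, H = 5, 8, 8                                  # sequence length, embed dim, head dim
x = rng.normal(size=(T, C))
Wq, Wk, Wv = (rng.normal(size=(C, H)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (5, 8)
```

Because of the mask, the first token's output is just its own value vector, which is what makes next-token training possible without leaking the future.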

Dependencies

  • Python 3
  • PyTorch: conda install pytorch torchvision -c pytorch

Limitations

  • Training takes ~1 hour on an Apple M1 Pro chip
  • Generation quality is limited by available compute

Example Output

Trained on Shakespearean text:

SICINIUS:
Is it strange?

Herald:
He's deceited, and children from his new spid
Then whomen he dares to him: were he worse.

BRUTUS:
You have pirtly not him.

MENENIUS:
What's the prisoner have not a silfa?

MONTAGUE:
O, and both shame, Menenius. Stanless, Thou art purpose;
And said thou pen for thy melting there,--

BENVOLIO:
Two sir, the earth proofs rids too come hither;
I thank you out, as thought sook for Ireland,

FRIAR LAURENCE:
His son, do your morself, that leaven your honours
Sufferable in more and suffer five.
A horse! High-graced York rights. And bother Montague
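Output like the above is produced one character at a time: the model scores the next token, a token is sampled from those scores, and the result is appended to the context. The repo's generation code isn't shown here, so this is a hedged sketch of that autoregressive loop with a stand-in model that returns random logits; the real logits would come from the trained transformer, and the ~65-symbol vocabulary is an assumption typical of character-level Shakespeare.

```python
import numpy as np

def sample(model, context, steps, vocab_size, rng):
    # Autoregressively extend `context` by `steps` sampled tokens.
    tokens = list(context)
    for _ in range(steps):
        logits = model(tokens)                 # next-token scores, shape (vocab_size,)
        probs = np.exp(logits - logits.max())  # stable softmax over the vocabulary
        probs /= probs.sum()
        tokens.append(int(rng.choice(vocab_size, p=probs)))
    return tokens

rng = np.random.default_rng(0)
vocab_size = 65                                # assumed char-level vocabulary size
# Stand-in "model": ignores the context and returns random scores.
dummy_model = lambda toks: rng.normal(size=vocab_size)
out = sample(dummy_model, [0], steps=10, vocab_size=vocab_size, rng=rng)
print(len(out))  # 11
```

Decoding the sampled token ids back to characters yields text like the sample above; quality depends entirely on the trained model behind `model`.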

Sources
