Handwriting with Pytorch

[image: alignment_01]

This is my attempt at implementing this amazing paper from Alex Graves (Generating Sequences With Recurrent Neural Networks), using Pytorch.
So far I've only implemented the unconditional handwriting generation part, but I'll add conditional generation as soon as I can.

How to

Install dependencies

The dependencies can be installed using:

pip install -r requirements.txt

You might need to adjust the pytorch version to your local setup (CUDA version, ...); see the Pytorch installation matrix for help.
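
For example, on a machine with CUDA 12.1 the matrix would suggest a command roughly like the one below; double-check it against the matrix for your exact setup:

pip install torch --index-url https://download.pytorch.org/whl/cu121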

See all the available commands

Simply use:

python cli.py --help

Train the LSTM network

Data download

The raw data needs to be downloaded from the official database into this folder: data/data_raw/

Prepare data

You can then prepare the data by running:

python cli.py prepare-data

It will convert the raw data into numpy arrays that can be easily loaded from the data/data_np_offsets/ directory.
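
As an illustration, one of the converted arrays could be inspected like this (the file pattern below is an assumption; the actual names depend on the preparation step):

import glob
import numpy as np

# List the arrays produced by `python cli.py prepare-data`
files = sorted(glob.glob("data/data_np_offsets/*.npy"))
strokes = np.load(files[0])
# Each array is expected to hold a sequence of (dx, dy, end-of-stroke) rows
print(strokes.shape)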

Training

You can launch the training by running:

python cli.py train-generator

A number of parameters can be passed to this command; use the following for more details:

python cli.py train-generator --help

Training for 20 epochs led to good enough results. As a point of reference, 1 epoch takes around 20s on my GPU (using the default command parameters).
The model parameters, as well as the examples generated during training, will be saved into the data/log/ folder.
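
For context, the training objective in Graves' paper is the negative log-likelihood of a mixture of bivariate Gaussians over the (dx, dy) pen offsets, plus a Bernoulli term for the end-of-stroke bit. Here is a minimal sketch of that per-step loss; the function name and tensor shapes are my assumptions, not this repository's exact code:

import torch
import torch.nn.functional as F

def handwriting_nll(pi, mu, sigma, rho, eos_logit, target):
    """Sketch of the per-step loss from Graves' paper (shapes are assumptions).

    pi:        (B, K)    mixture weights (already softmax-normalised)
    mu:        (B, K, 2) component means for the (dx, dy) offsets
    sigma:     (B, K, 2) component standard deviations (positive)
    rho:       (B, K)    correlations in (-1, 1)
    eos_logit: (B,)      raw logit for the end-of-stroke probability
    target:    (B, 3)    ground-truth (dx, dy, eos)
    """
    dx = target[:, 0:1] - mu[:, :, 0]   # (B, K)
    dy = target[:, 1:2] - mu[:, :, 1]   # (B, K)
    sx, sy = sigma[:, :, 0], sigma[:, :, 1]

    # Log-density of every bivariate Gaussian component
    z = (dx / sx) ** 2 + (dy / sy) ** 2 - 2 * rho * dx * dy / (sx * sy)
    one_minus_rho2 = 1 - rho ** 2
    log_gauss = -z / (2 * one_minus_rho2) \
        - torch.log(2 * torch.pi * sx * sy * torch.sqrt(one_minus_rho2))

    # log sum_k pi_k N_k, computed with log-sum-exp for numerical stability
    log_mixture = torch.logsumexp(torch.log(pi) + log_gauss, dim=1)

    # Bernoulli term for the end-of-stroke bit
    eos_loss = F.binary_cross_entropy_with_logits(eos_logit, target[:, 2])

    return -log_mixture.mean() + eos_loss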

Generate examples

Unconditional generation of examples can be done using:

python cli.py generate

The generated results will be saved into data/log/plots/
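
For reference, unconditional generation in Graves' setup samples one point at a time: pick a mixture component, draw the (dx, dy) offset from that component's bivariate Gaussian, and draw the end-of-stroke bit from a Bernoulli. A rough sketch of a single sampling step (names and shapes are assumptions, not this repository's code):

import torch

def sample_step(pi, mu, sigma, rho, eos_prob):
    """Draw one (dx, dy, eos) point from the network output at one time step.

    Assumed shapes (batch size 1): pi (K,), mu (K, 2), sigma (K, 2),
    rho (K,), eos_prob a scalar tensor in (0, 1).
    """
    # Pick a mixture component according to its weight
    k = torch.multinomial(pi, num_samples=1).item()
    sx, sy, r = sigma[k, 0].item(), sigma[k, 1].item(), rho[k].item()
    # Covariance matrix of the chosen bivariate Gaussian
    cov = torch.tensor([[sx * sx, r * sx * sy],
                        [r * sx * sy, sy * sy]])
    dxdy = torch.distributions.MultivariateNormal(mu[k], cov).sample()
    # End-of-stroke bit is a Bernoulli draw
    eos = torch.bernoulli(eos_prob).item()
    return dxdy[0].item(), dxdy[1].item(), eos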

Unconditional generation examples

Generated examples with no conditioning on the text characters:

[image: no_alignment]

[image: alignment_01]

Resources

I would like to give a big acknowledgment to the awesome repositories that helped me a lot in understanding how to parse & transform the data, as well as the intricacies of the RNN architecture implementation with the negative log-likelihood loss function.
