
tv-script-generator

Example of using an LSTM RNN to generate random TV scripts. In this project the neural network model generates scripts for the popular TV series Seinfeld. It is trained on the scripts of season 9 and then tries to generate some random dialogue. The question is how much sense that random dialogue makes!

Notebook

dlnd_tv_script_generation.ipynb is the main notebook.

Required files

Glimpse at the generated script

Below is part of a script generated by the model. The full script is saved in the generated_script_1.txt file; here is a snippet:

elaine:(leaving) hey.
jerry: what is that?
elaine: what happened to the last story?
george: i thought i could do this.
kramer: i think i wanted a little effeminate.
george: i have to say that.
george: i was convinced.
jerry: yeah, i thought we were gotten dating in the contest.
elaine: i was afraid i sent to say you on.
elaine: hey!
george: what?
george: i mean, i don't understand.(starts opening the sleeve employee, and the lopper skank, and i was convinced.

As you can see, the generated script is not contextually coherent, but the sentence constructions are not that bad.

Steps I followed

The steps were laid out by the Udacity course instructor; my responsibility was to implement the corresponding functions.

create_lookup_tables

Creates a numeric identifier for each word in the vocabulary and returns the two following maps:

  1. word -> identifier
  2. identifier -> word
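A minimal sketch of what such a function could look like (the frequency-based ordering is a common convention, not necessarily the one used in the notebook):

```python
from collections import Counter

def create_lookup_tables(words):
    """Build word->id and id->word maps from a list of words."""
    # Count occurrences so more frequent words get lower identifiers
    # (an illustrative choice; any consistent ordering works).
    counts = Counter(words)
    vocab = sorted(counts, key=counts.get, reverse=True)
    vocab_to_int = {word: i for i, word in enumerate(vocab)}
    int_to_vocab = {i: word for word, i in vocab_to_int.items()}
    return vocab_to_int, int_to_vocab
```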

token_lookup

Replaces punctuation with special keyword tokens, so that, for example, "hey!" and "hey" do not end up as two distinct vocabulary entries.
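A sketch of such a lookup (the exact placeholder token names are illustrative):

```python
def token_lookup():
    """Map each punctuation symbol to a placeholder keyword token."""
    return {
        '.': '||period||',
        ',': '||comma||',
        '"': '||quotation_mark||',
        ';': '||semicolon||',
        '!': '||exclamation_mark||',
        '?': '||question_mark||',
        '(': '||left_paren||',
        ')': '||right_paren||',
        '-': '||dash||',
        '\n': '||return||',
    }
```

During preprocessing each punctuation character is replaced by its token (surrounded by spaces) before the text is split into words.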

batch_data

Converts the whole training script into a training dataset that is fed to the model in multiple batches. Based on the given sequence_length, I use DataLoader from PyTorch to create the batches.
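A sketch of the batching step under the common windowing scheme (each input is a run of sequence_length word ids, the target is the word that follows):

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

def batch_data(words, sequence_length, batch_size):
    """Slide a window over the encoded script and wrap it in a DataLoader."""
    features, targets = [], []
    for i in range(len(words) - sequence_length):
        features.append(words[i:i + sequence_length])   # input window
        targets.append(words[i + sequence_length])      # next word
    data = TensorDataset(torch.tensor(features), torch.tensor(targets))
    return DataLoader(data, batch_size=batch_size, shuffle=True)
```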

RNN Model

The Neural Network model comprises:

  • Embedding layer
  • LSTM RNN layer
  • Linear classification layer

In the notebook you will see how I experimented with different hyperparameters. The best run produced a loss of 2.88 over the last batch.

generate

Finally, the generate function uses the trained model to produce another random script.
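The generation loop can be sketched as follows: feed the model its own output and sample the next word from the predicted distribution. The argument names and the multinomial sampling strategy are assumptions for illustration, not necessarily what the notebook does:

```python
import torch
import torch.nn.functional as F

def generate(model, start_token, int_to_vocab, sequence_length, n_words=100):
    """Generate n_words by repeatedly sampling the model's next-word prediction."""
    model.eval()
    seq = [start_token] * sequence_length  # seed the input window
    words = []
    with torch.no_grad():
        for _ in range(n_words):
            x = torch.tensor([seq[-sequence_length:]])
            logits, _ = model(x, None)
            # Sample from the softmax distribution over the vocabulary.
            probs = F.softmax(logits, dim=1).squeeze()
            word_id = torch.multinomial(probs, 1).item()
            seq.append(word_id)
            words.append(int_to_vocab[word_id])
    return ' '.join(words)
```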
