Cog demo for GPT-2 finetuned on World of Warcraft quests
The weights for this demo can be downloaded from here.
The contents of this rar archive should be extracted into the folder checkpoint/gpt2-wow.
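
Once extracted, the checkpoint can be loaded like any local GPT-2 model. A minimal sketch, assuming the archive contains a standard Hugging Face GPT-2 checkpoint directory (model weights, config, and tokenizer files):

```python
# Loading sketch -- assumes checkpoint/gpt2-wow holds a standard Hugging Face
# GPT-2 checkpoint (pytorch_model.bin, config.json, tokenizer files).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("checkpoint/gpt2-wow")
model = GPT2LMHeadModel.from_pretrained("checkpoint/gpt2-wow")
model.eval()  # inference only, no gradient updates needed
```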
Open-access paper in the ACM Digital Library: Fine-tuning GPT-2 on annotated RPG quests for NPC dialogue generation
```bibtex
@inproceedings{10.1145/3472538.3472595,
author = {van Stegeren, Judith and My\'{s}liwiec, Jakub},
title = {Fine-tuning GPT-2 on annotated RPG quests for NPC dialogue generation},
year = {2021},
isbn = {9781450384223},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3472538.3472595},
doi = {10.1145/3472538.3472595},
abstract = {GPT-2, a neural language model trained on a large dataset of English web text, has been used in a variety of natural language generation tasks because of the language quality and coherence of its outputs. In order to investigate the usability of GPT-2 for text generation for video games, we fine-tuned GPT-2 on a corpus of video game quests and used this model to generate dialogue lines for quest-giver NPCs in a role-playing game. We show that the model learned the structure of quests and NPC dialogue, and investigate how the temperature parameter influences the language quality and creativity of the output artifacts. We evaluated our approach with a crowdsource experiment in which human judges were asked to rate hand-written and generated quest texts on language quality, coherence and creativity.},
booktitle = {Proceedings of the 16th International Conference on the Foundations of Digital Games},
articleno = {2},
numpages = {8},
keywords = {quest generation, procedural content generation for games, World of Warcraft, Transformers, Natural language generation, NPC dialogue, MMORPG, GPT-2, English},
location = {Montreal, QC, Canada},
series = {FDG '21}
}
```
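
The paper investigates how the temperature parameter trades off language quality against creativity in the generated quest texts. A sketch of generating an NPC dialogue line with temperature sampling, assuming the model and tokenizer loaded above (the prompt below is illustrative, not taken from the training corpus):

```python
# Sampling sketch -- temperature controls the quality/creativity trade-off
# studied in the paper; the prompt is a made-up example.
prompt = "Greetings, adventurer. I have a task for you."
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_length=120,
    do_sample=True,
    temperature=0.8,  # lower -> more conservative, higher -> more creative
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```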