AI generation of new Pokédex entries.
Specify your desired Pokémon type, and the model will output a corresponding Pokédex entry!
TODO: examples
We use a distilled version of OpenAI's GPT-2 model. GPT-2 is a transformer-based language model that generates synthetic text samples when primed with an arbitrary input. Network distillation is performed in a process similar to the one used for DistilBERT. DistilGPT-2 is twice as fast and 33% smaller than GPT-2, at the cost of perplexity rising from 16.3 to 21.1 in the distilled version.
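As a quick illustration of priming a GPT-2-style model with arbitrary input, the pretrained DistilGPT-2 checkpoint can be loaded through the Transformers pipeline API. This is a minimal sketch, not this repository's code; the prompt is arbitrary:

```python
from transformers import pipeline, set_seed

# Load the pretrained (not yet fine-tuned) DistilGPT-2 checkpoint
generator = pipeline("text-generation", model="distilgpt2")
set_seed(42)  # make sampling reproducible

# Prime the model with an arbitrary input and sample a continuation
print(generator("In a hole in the ground", max_length=40)[0]["generated_text"])
```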
TODO
- Fine-tune DistilGPT-2 on Poke-Dataset (see the sketch after the command below).

```
python train.py --mode train --data_file poke.txt --output_dir logs
```
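For reference, a bare-bones version of this fine-tuning step might look as follows using the Transformers Trainer API. This is only a sketch, not the repository's train.py; the file and directory names mirror the command above, the assumed format of poke.txt is one entry per line, and the hyperparameters are placeholders:

```python
from transformers import (
    DataCollatorForLanguageModeling,
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    TextDataset,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2TokenizerFast.from_pretrained("distilgpt2")
model = GPT2LMHeadModel.from_pretrained("distilgpt2")

# poke.txt is assumed to hold the raw Pokedex entries, one per line
train_dataset = TextDataset(tokenizer=tokenizer, file_path="poke.txt", block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)  # causal LM, no masking

training_args = TrainingArguments(
    output_dir="logs",
    num_train_epochs=3,           # placeholder hyperparameters
    per_device_train_batch_size=8,
)
trainer = Trainer(
    model=model,
    args=training_args,
    data_collator=collator,
    train_dataset=train_dataset,
)
trainer.train()
trainer.save_model("logs")  # write final weights to the assumed output directory
```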
This repository uses the Hugging Face Transformers library. It provides state-of-the-art general-purpose architectures (BERT, GPT-2, RoBERTa, XLM, DistilBERT, XLNet, T5, etc.) for Natural Language Understanding (NLU) and Natural Language Generation (NLG).
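Once a fine-tuned checkpoint exists, generating an entry primed with a type could look like the sketch below. Two assumptions: the checkpoint was saved to logs as in the training command above, and the model is primed with the bare type name (the actual prompt format may differ):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# "logs" is the checkpoint directory assumed from the training command above
tokenizer = AutoTokenizer.from_pretrained("logs")
model = AutoModelForCausalLM.from_pretrained("logs")

# Prime with a Pokemon type; the exact prompt format is an assumption
inputs = tokenizer("Electric", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=60,
    do_sample=True,
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,  # silence the missing-pad-token warning
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```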