Solving Hangman Game using Neural Network

The solution is inspired by the training approach of BERT, MLM (Masked Language Modeling). It is a fill-in-the-blank task, where a model uses the context words surrounding a mask token to predict what the masked word should be. For example:

Input Text: Hangman is a classic [MASK] game 
Label: [MASK] = word

In the game Hangman, it would be:

Input Text: HANG#AN
Label: # = M
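A character-level model needs the masked word as a sequence of integer indices. The sketch below shows one hypothetical encoding of an example like HANG#AN; the mask token, mapping, and function names are assumptions for illustration, not taken from preprocess.py.

```python
# Hypothetical sketch: encode a partially guessed Hangman word as
# integer indices for a character-level model. Index 0 is reserved
# for the mask token '#'; letters A..Z map to 1..26.
import string

MASK = "#"
CHAR2IDX = {MASK: 0, **{c: i + 1 for i, c in enumerate(string.ascii_uppercase)}}

def encode(word):
    """Map a masked word like 'HANG#AN' to a list of integer indices."""
    return [CHAR2IDX[ch] for ch in word.upper()]

print(encode("HANG#AN"))  # [8, 1, 14, 7, 0, 1, 14]
```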

Environment

  • Python 3.6

Clone the Project

git clone https://github.com/aarontong95/HangmanAI.git

Training

64 GB of RAM is needed for training. Otherwise, you can lower the value of FRAC in config.py, which is the proportion of the training split that is used.

python train.py
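One way lowering FRAC could reduce memory is by sampling only a fraction of the training words before generating masked examples. The snippet below is a minimal sketch of that idea; the FRAC name comes from the README, but the sampling logic and `subsample` helper are assumptions.

```python
# Hypothetical sketch: use FRAC to subsample the training split and
# cut memory usage before generating masked training examples.
import random

FRAC = 0.5  # proportion of the training split to keep

def subsample(words, frac, seed=42):
    """Return a reproducible random sample of round-down frac * len(words) words."""
    rng = random.Random(seed)
    k = int(len(words) * frac)
    return rng.sample(words, k)

words = ["hangman", "python", "neural", "network"]
print(subsample(words, FRAC))  # two of the four words, chosen by the seeded RNG
```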

Testing

The success rate is about 44% in out-of-sample testing.

python test.py

What has been implemented

  • Generate training samples for the model to learn from (preprocess.py)
  • Bidirectional LSTM as the model (model.py)
  • Hangman game environment (game.py)
  • Play the Hangman game with the model (palyer.py)
  • More details in Solution.ipynb
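Putting the pieces together, a model-guided player can repeatedly query the model for letter probabilities on the current masked word and guess the most probable letter not yet tried. This is a minimal sketch of that loop; `predict_probs` is a uniform stand-in for the LSTM, and none of these names are taken from palyer.py.

```python
# Hypothetical sketch: pick the next Hangman guess from per-letter
# probabilities, skipping letters that were already guessed.
import string

def predict_probs(masked_word):
    # Stand-in for the trained LSTM: uniform probabilities over A..Z.
    return {c: 1.0 / 26 for c in string.ascii_uppercase}

def next_guess(masked_word, guessed):
    """Return the most probable letter that has not been guessed yet."""
    probs = predict_probs(masked_word)
    candidates = {c: p for c, p in probs.items() if c not in guessed}
    return max(candidates, key=candidates.get)

print(next_guess("HANG#AN", {"H", "A", "N", "G"}))
```

With a real model, `predict_probs` would aggregate the network's output distribution over the masked positions, so more informative contexts yield sharper guesses.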
