An AI model that predicts the next word in a sentence using a word-level LSTM.
- 🔠 Word-level tokenization for deeper semantic understanding.
- 🤖 LSTM-based Sequential model trained on the Harry Potter and Percy Jackson books (a rough sketch follows the note below).
- 📊 Model comparison across different LSTM layers and architectures in Model_Comparisons.csv.
- 📉 Loss graphs included for visualizing training performance.
- 💾 Demo predicts the next word based on custom input (see the prediction sketch after the setup steps).

Note: the model sometimes makes predictions that cannot be found in the original dictionary.
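
As a rough orientation, here is a minimal sketch of what the word-level tokenization, the LSTM-based Sequential model, and the loss graph can look like in Keras. The corpus file name `corpus.txt`, the layer sizes, and the epoch count are illustrative assumptions rather than the repository's exact code; the actual pipeline lives in LSTM_Model_Training.ipynb.

```python
import matplotlib.pyplot as plt
import tensorflow as tf

# Load the training corpus (file name is hypothetical).
corpus_text = open("corpus.txt", encoding="utf-8").read().lower()

# Word-level tokenization: each distinct word is mapped to an integer index.
tokenizer = tf.keras.preprocessing.text.Tokenizer()
tokenizer.fit_on_texts([corpus_text])
vocab_size = len(tokenizer.word_index) + 1  # +1 for the padding index 0

# Build (prefix -> next word) training pairs from each line of the corpus.
sequences = []
for line in corpus_text.split("\n"):
    encoded = tokenizer.texts_to_sequences([line])[0]
    for i in range(1, len(encoded)):
        sequences.append(encoded[: i + 1])

max_len = max(len(seq) for seq in sequences)
sequences = tf.keras.utils.pad_sequences(sequences, maxlen=max_len, padding="pre")
X, y = sequences[:, :-1], sequences[:, -1]  # last word in each prefix is the target

# LSTM-based Sequential model; layer sizes and epochs here are illustrative only.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 100),
    tf.keras.layers.LSTM(150),
    tf.keras.layers.Dense(vocab_size, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy",
              optimizer="adam", metrics=["accuracy"])
history = model.fit(X, y, epochs=20, verbose=1)

# Loss graph for visualizing training performance.
plt.plot(history.history["loss"])
plt.xlabel("Epoch")
plt.ylabel("Training loss")
plt.show()
```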

- Clone the repository:

  ```bash
  git clone https://github.com/RJ601/Next-Word-Prediction-Using-Word-Level-LSTM.git
  cd Next-Word-Prediction-Using-Word-Level-LSTM
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

👉 Open LSTM_Model_Training.ipynb, optionally modify the hyperparameters, and run all cells.
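
Once training has run, prediction from custom input can look roughly like the sketch below. It reuses the `tokenizer`, `model`, and `max_len` names from the training sketch above, and the seed sentence is just an example; the notebook's demo cell may be organized differently.

```python
import numpy as np
import tensorflow as tf

def predict_next_word(seed_text, tokenizer, model, max_len):
    """Return the single most likely next word for `seed_text`."""
    encoded = tokenizer.texts_to_sequences([seed_text.lower()])[0]
    padded = tf.keras.utils.pad_sequences([encoded], maxlen=max_len - 1, padding="pre")
    probs = model.predict(padded, verbose=0)[0]   # probability distribution over the vocabulary
    predicted_index = int(np.argmax(probs))
    # Index 0 is reserved for padding, so it has no word mapped to it.
    return tokenizer.index_word.get(predicted_index)

print(predict_next_word("harry looked at the", tokenizer, model, max_len))
```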
- Platform: Jupyter Notebook
- Language: Python 3.9.23
- Libraries Used:
  - tensorflow==2.19.0
  - keras==3.10.0
  - scikit-learn==1.6.1
  - matplotlib==3.9.4
  - numpy==2.0.2
  - pandas==2.3.0
  - nltk==3.9.1
This project is licensed under the MIT License. Feel free to use, modify, or share it with attribution.