A next-word prediction project using the Hugging Face Transformers library and GPT-2 for text generation. GPT2Tokenizer preprocesses the input text, and the GPT2LMHeadModel is fine-tuned for the task. Evaluation measures accuracy, perplexity, and fluency.
machine-learning tokenizer text-analysis deeplearning language-model nltk-python gpt-2 huggingface-transformers nltk-corpus transformers gpt2lmheadmodel
Updated May 16, 2025 · Jupyter Notebook
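The evaluation above reports perplexity; as a minimal sketch of how that metric follows from a model's per-token probabilities (the probabilities below are made-up stand-ins, not actual GPT-2 output):

```python
import math

def perplexity(token_probs):
    """Perplexity = exp of the average negative log-probability per token."""
    nll = [-math.log(p) for p in token_probs]
    return math.exp(sum(nll) / len(nll))

# Hypothetical probabilities a model assigned to each token of a sentence.
probs = [0.25, 0.5, 0.125, 0.5]
print(round(perplexity(probs), 3))  # lower is better; a perfect model scores 1.0
```

Equivalently, perplexity is the geometric mean of the inverse token probabilities, which is why it is often read as the model's effective branching factor.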