Welcome to the Natural Language Processing (NLP) Repository! Here, you'll find a variety of projects related to NLP, an exciting field focused on the interaction between computers and human language.
This repository explores a wide range of NLP projects. If a notebook requires a dataset, a download link is provided, or the dataset is stored in the `data` folder of the corresponding section.
- `Multi class text classification - fine tuning-distilbert.ipynb`
- `Sentiment Analysis - fine tunning-distilbert.ipynb`
| Preview | About |
|---|---|
| 💥 Basic text classification 💥 This project serves as an introduction to PyTorch and Weights & Biases (wandb) by implementing and experimenting with deep learning models for image and text classification. The primary objectives include understanding PyTorch’s computation graphs, implementing a basic classifier, logging experiments with wandb, and modifying the baseline model to enhance performance. | |
| 💥 Text Generation 💥 This project explores generative text models, focusing on Recurrent Neural Networks (RNNs) and Transformer-based language models. It involves implementing and experimenting with architectures such as bidirectional RNNs, Transformers, Sliding Window Attention, Rotary Positional Embeddings (RoPE), and Grouped Query Attention (GQA). The goal is to understand how different generative techniques impact language modeling and computational efficiency. | |
| 💥 Fine-Tuning GPT2 with LoRA 💥 This repository implements Low-Rank Adaptation (LoRA), a Parameter-Efficient Fine-Tuning (PEFT) method, to efficiently fine-tune a pre-trained GPT-2 model. | |
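The core idea behind LoRA is to freeze the pre-trained weight matrix `W` and learn only a low-rank update `(alpha / r) * B @ A`, drastically reducing the number of trainable parameters. The sketch below illustrates that idea in plain NumPy; the dimensions, rank `r`, scaling `alpha`, and the `lora_forward` helper are illustrative assumptions, not the repository's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a small linear layer with rank-2 adapters.
d_out, d_in, r, alpha = 8, 8, 2, 4

W = rng.normal(size=(d_out, d_in))      # frozen pre-trained weight
A = rng.normal(size=(r, d_in)) * 0.01   # trainable low-rank factor
B = np.zeros((d_out, r))                # zero-initialized, so W' == W at the start

def lora_forward(x, W, A, B, alpha, r):
    """Forward pass with the low-rank update added to the frozen weight."""
    return x @ (W + (alpha / r) * (B @ A)).T

x = rng.normal(size=(1, d_in))
# With B = 0 the adapted layer reproduces the frozen layer exactly,
# so fine-tuning starts from the pre-trained model's behavior.
assert np.allclose(lora_forward(x, W, A, B, alpha, r), x @ W.T)
```

In practice (e.g. with the Hugging Face `peft` library), the same decomposition is attached to the attention projection matrices of GPT-2, and only `A` and `B` receive gradient updates.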



