@evNLP

Evolving NLP

Natural Language Processing Tasks

Welcome to evNLP 🚀

At evNLP, we’re crafting the future of Natural Language Processing. Whether it’s experimenting with embeddings, optimizing Transformers, or creating state-of-the-art classifiers, our repositories are here to empower researchers, developers, and enthusiasts. Dive in and let’s revolutionize NLP together.


🛠️ Our Projects

Here’s what we’ve been building to push the boundaries of NLP:


Step into the world of Transformers and unlock the power of self-attention mechanisms. This repository is your playground for understanding and implementing the building blocks of modern NLP.
Why You’ll Love It:

  • Build Transformer self-attention layers from scratch in PyTorch.
  • Explore examples for tasks like summarization, text classification, and translation.
  • Use this as a foundation to create models like BERT or GPT.
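To give a flavour of the building block, here is a minimal sketch of a scaled dot-product self-attention layer in PyTorch; the class name, dimensions, and toy batch are illustrative, not the repository's actual API.

```python
# Minimal scaled dot-product self-attention sketch in PyTorch.
# Names and shapes are illustrative placeholders, not the repo's exact code.
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    def __init__(self, embed_dim: int):
        super().__init__()
        self.query = nn.Linear(embed_dim, embed_dim)
        self.key = nn.Linear(embed_dim, embed_dim)
        self.value = nn.Linear(embed_dim, embed_dim)
        self.scale = embed_dim ** 0.5

    def forward(self, x):                                  # x: (batch, seq_len, embed_dim)
        q, k, v = self.query(x), self.key(x), self.value(x)
        scores = q @ k.transpose(-2, -1) / self.scale      # (batch, seq_len, seq_len)
        weights = torch.softmax(scores, dim=-1)            # attention distribution per token
        return weights @ v                                 # weighted sum of value vectors

x = torch.randn(2, 10, 64)                                 # toy batch: 2 sequences of 10 tokens
print(SelfAttention(64)(x).shape)                          # torch.Size([2, 10, 64])
```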

Combine the simplicity of LSTMs with the precision of attention mechanisms. This repo is perfect for tackling sequence tasks like machine translation with added focus on the most important input features.
Why You’ll Love It:

  • Implement attention over LSTMs to highlight key text elements.
  • Optimize translation workflows for tasks like English-to-Spanish translation.
  • PyTorch examples included to get you started in no time.
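As a quick sketch of the idea, the snippet below pools LSTM outputs with a learned soft-attention weighting in PyTorch; the module name, dimensions, and toy inputs are assumptions for illustration rather than the repo's exact code.

```python
# Sketch of soft attention over LSTM outputs (illustrative names and sizes).
import torch
import torch.nn as nn

class AttentiveLSTM(nn.Module):
    def __init__(self, vocab_size: int, embed_dim: int, hidden_dim: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim, 1)                # one score per timestep

    def forward(self, tokens):                              # tokens: (batch, seq_len)
        outputs, _ = self.lstm(self.embed(tokens))          # (batch, seq_len, hidden_dim)
        weights = torch.softmax(self.attn(outputs), dim=1)  # (batch, seq_len, 1)
        return (weights * outputs).sum(dim=1)               # context vector per sequence

tokens = torch.randint(0, 1000, (4, 12))                    # toy batch of token ids
print(AttentiveLSTM(1000, 32, 64)(tokens).shape)            # torch.Size([4, 64])
```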

Adapt pre-trained models like GPT-2 to your specific tasks with ease. Whether it’s generating creative content or refining a chatbot, this repo has you covered.
Why You’ll Love It:

  • Scripts for fine-tuning GPT-2 on custom datasets.
  • Step-by-step examples for text generation, classification, and more.
  • Designed for developers who want to squeeze more out of their language models.
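A hedged sketch of what a fine-tuning loop can look like, assuming the Hugging Face transformers library; the toy texts, learning rate, and per-example updates are placeholders, not the repository's actual scripts.

```python
# Minimal GPT-2 fine-tuning sketch with Hugging Face transformers.
# Data and hyperparameters are placeholders for illustration only.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

texts = ["example training sentence", "another custom-domain sentence"]  # placeholder dataset
model.train()
for text in texts:
    batch = tokenizer(text, return_tensors="pt")
    # For causal language modeling, the labels are the input ids themselves.
    loss = model(**batch, labels=batch["input_ids"]).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```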

From traditional methods to deep learning, this repository covers it all for text classification. Ideal for building sentiment analysis pipelines, category classifiers, and more.
Why You’ll Love It:

  • Compare Naive Bayes, logistic regression, and neural networks.
  • Build projects in both PyTorch and sklearn.
  • Hands-on examples for real-world applications.
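For a taste of the comparison, here is a small scikit-learn sketch pitting Naive Bayes against logistic regression on a toy sentiment corpus; the data and pipeline are illustrative assumptions, not the repository's experiments.

```python
# Compare two classic text-classification baselines on a toy sentiment corpus.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

texts = ["great movie", "terrible plot", "loved it", "boring and slow"]  # toy corpus
labels = [1, 0, 1, 0]                                                    # 1 = positive

for clf in (MultinomialNB(), LogisticRegression()):
    pipeline = make_pipeline(CountVectorizer(), clf)      # bag-of-counts features + classifier
    pipeline.fit(texts, labels)
    print(type(clf).__name__, pipeline.predict(["what a great film"]))
```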

Take your NLP to the next level with GloVe embeddings. Represent words in rich vector spaces that capture their context and relationships.
Why You’ll Love It:

  • Integrate GloVe embeddings into LSTM-based models or classifiers.
  • Pre-trained embeddings ready to supercharge sentiment analysis and more.
  • Clear PyTorch examples to make implementation seamless.
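Below is a minimal sketch of plugging pre-trained GloVe vectors into a frozen PyTorch embedding layer, assuming a plain-text GloVe file; the file path and toy vocabulary are placeholders, not the repo's setup.

```python
# Load pre-trained GloVe vectors into a frozen nn.Embedding layer.
# The GloVe file path and the vocabulary below are placeholders.
import torch
import torch.nn as nn

def load_glove(path, vocab, dim=100):
    """Build an embedding matrix for `vocab` from a plain-text GloVe file."""
    matrix = torch.randn(len(vocab), dim) * 0.1            # random fallback for OOV words
    with open(path, encoding="utf-8") as f:
        for line in f:
            word, *values = line.split()
            if word in vocab:
                matrix[vocab[word]] = torch.tensor([float(v) for v in values])
    return matrix

vocab = {"<unk>": 0, "good": 1, "bad": 2}                  # toy vocabulary
weights = load_glove("glove.6B.100d.txt", vocab)           # placeholder path
embedding = nn.Embedding.from_pretrained(weights, freeze=True)
print(embedding(torch.tensor([1, 2])).shape)               # torch.Size([2, 100])
```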

Turn words into numbers with the Bag of Words (BOW) approach. Perfect for quick prototyping and lightweight classification tasks.
Why You’ll Love It:

  • Generate BOW matrices from any text corpus.
  • Compare documents using similarity metrics like cosine distance.
  • Great for classification, retrieval, and simple text analysis.
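A quick sketch of the BOW-plus-cosine-similarity workflow using scikit-learn; the three-document corpus is purely illustrative.

```python
# Build a Bag of Words matrix and compare documents with cosine similarity.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.metrics.pairwise import cosine_similarity

corpus = ["the cat sat on the mat", "the dog sat on the log", "stock prices fell today"]
bow = CountVectorizer().fit_transform(corpus)   # sparse document-term count matrix
print(cosine_similarity(bow))                   # pairwise document similarities
```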

🌍 Why Choose evNLP?

Because we don’t just build repositories—we create tools that solve real-world NLP challenges. At evNLP, our goal is to make cutting-edge NLP accessible, practical, and powerful for everyone.


🤝 How to Get Involved

We’d love your input! Here’s how you can contribute:

  1. Fork a repo and create your own branch.
  2. Add your improvements, fixes, or experiments.
  3. Submit a pull request—we’ll review it and collaborate to make it even better.

Whether you’re a researcher, developer, or enthusiast, there’s a place for you at evNLP.


Let’s build the future of NLP together. evNLP—where language meets innovation. 🌟

Pinned

  1. SelfAttention (Public): Transformer Model Implementation in PyTorch. Jupyter Notebook.

  2. Instruct (Public): Instructing Mistral and GPT-2 for a specific task. Jupyter Notebook.
