
NLP Transformers Demo

Deep Learning architectures based on Transformers (Vaswani et al., 2017, "Attention Is All You Need") have achieved state-of-the-art results in Natural Language Processing (NLP) and are quickly spreading to other fields (e.g., DETR, the DEtection TRansformer, for object detection). This notebook aims to demonstrate the power and ease of use of the HuggingFace Pipelines, which can be deployed in a few lines of code to achieve outstanding results in several NLP tasks, including Sentiment Classification, Text Generation, Summarization, and more.

(Screenshot: question_answering pipeline demo)
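
As a quick taste of the API, the following minimal sketch (the task names are standard transformers pipeline identifiers; the default models are downloaded on first use) runs a sentiment classifier and a question-answering pipeline:

from transformers import pipeline

# Sentiment Classification: returns a label (e.g. POSITIVE/NEGATIVE) with a confidence score
classifier = pipeline("sentiment-analysis")
print(classifier("Transformers make NLP remarkably easy to use."))

# Question Answering: extracts the answer span from the provided context
qa = pipeline("question-answering")
result = qa(question="Who introduced the Transformer architecture?",
            context="The Transformer architecture was introduced by Vaswani et al. in 2017.")
print(result["answer"])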

Dependencies

The environment needed to run the application can be set up by following the instructions below:

  1. Create a Python virtual environment (venv) to run the code:

    python3 -m venv my-env

  2. Activate the new environment:

    • Windows: my-env\Scripts\activate.bat
    • macOS and Linux: source my-env/bin/activate
  3. Install all the dependencies from requirements.txt (a quick import check is sketched after this list):

    pip install -r requirements.txt

  4. To make the environment visible as a kernel for Jupyter, type:

    python -m ipykernel install --user --name=my-env
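
After these steps, a quick sanity check (assuming requirements.txt pins the transformers library) is to run the following from the activated environment:

# Quick import check from inside the activated my-env environment
import transformers
print(transformers.__version__)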

Use

To run the notebooks, with the environment activated (see the Dependencies section), make sure it is registered as a Jupyter kernel:

python -m ipykernel install --user --name=my-env

Check that your kernel was created by listing the available kernels:

jupyter kernelspec list

Finally, launch Jupyter from a base environment where it is installed and, inside the notebook, select the my-env kernel so that all the packages needed for the project are ready to import.

The NLP_Transformers_Pipelines notebook implements several NLP pipelines to experiment with. The NLP_App notebook, on the other hand, provides small GUIs for the user to explore the Transformers capabilities.
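
For example, a summarization pipeline similar to the ones explored in the notebooks (a minimal sketch with the default model, not necessarily the exact code in the notebooks) looks like this:

from transformers import pipeline

# Summarization: condenses a longer passage into a short summary
summarizer = pipeline("summarization")
article = ("Transformer-based models have become the dominant architecture in NLP, "
           "replacing recurrent networks in tasks such as translation, summarization, "
           "and question answering thanks to the self-attention mechanism.")
print(summarizer(article, max_length=40, min_length=10)[0]["summary_text"])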
