Transformer
In “Attention Is All You Need”, Google introduced the Transformer, a novel neural network architecture based on a self-attention mechanism that the authors argue is particularly well suited for language understanding.

A Transformer applies a self-attention mechanism that scans every word in a sequence and assigns attention scores (weights) indicating how strongly each word relates to the others. The Transformer was introduced as a simple network architecture based solely on attention mechanisms, dispensing with recurrence and convolutions entirely.
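As a rough illustration of how those scores are computed, here is a minimal scaled dot-product attention sketch (a PyTorch toy example, not code from this notebook; the tensor names and shapes are illustrative):

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (batch, seq_len, d_model) -- one vector per word
    d_k = q.size(-1)
    # score every word against every other word, scaled by sqrt(d_k)
    scores = torch.matmul(q, k.transpose(-2, -1)) / (d_k ** 0.5)
    weights = F.softmax(scores, dim=-1)  # attention weights sum to 1 per word
    return torch.matmul(weights, v), weights

# toy example: 1 sentence, 4 words, 8-dimensional embeddings
x = torch.randn(1, 4, 8)
out, attn = scaled_dot_product_attention(x, x, x)
print(attn.shape)  # (1, 4, 4): one weight for every word pair
```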

The tasks performed in this notebook are listed below; a short usage sketch follows the list.

  • Text generation (in English): provide a prompt and the model will generate what follows.
  • Sentiment analysis: is a text positive or negative?
  • Question answering: provide the model with some context and a question, and it extracts the answer from the context.
  • Summarization: generate a summary of a long text.
  • Language translation: translate a text into another language.
  • Text prediction: predict the next words of a given sentence.
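
A minimal sketch of how tasks like these are commonly run with the Hugging Face `pipeline` API (the task identifiers are the library's standard ones; the prompts are illustrative and the default models are not necessarily the ones used in the notebook):

```python
from transformers import pipeline

# Each pipeline downloads a default pretrained model for its task.
generator = pipeline("text-generation")
print(generator("The Transformer architecture is", max_length=30))

classifier = pipeline("sentiment-analysis")
print(classifier("I really enjoyed this notebook."))

qa = pipeline("question-answering")
print(qa(question="What does the Transformer dispense with?",
         context="The Transformer relies entirely on self-attention, "
                 "dispensing with recurrence and convolutions."))

summarizer = pipeline("summarization")
translator = pipeline("translation_en_to_fr")
```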