Demonstration notebook for training DistilBERT on GLUE and uploading the model to Hugging Face.
Updated Dec 16, 2020 - Jupyter Notebook
Jupyter notebooks designed for various projects.
Collection of notebooks used during the Fibe Hack the Vibe 2.0 competition on HackerEarth.
Transformers workshop on behalf of ML India. Contains resource notebooks for training and inference with large-scale transformer models on different downstream tasks.
A Jupyter notebook that illustrates and compares different approaches to sentence similarity scoring.
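The notebook's own code is not reproduced here; as a minimal, dependency-free sketch of two lexical baselines that such comparisons typically start from (function names and example sentences are illustrative, not from the notebook):

```python
import math
from collections import Counter

def jaccard_similarity(a: str, b: str) -> float:
    """Set-overlap similarity over lowercased word tokens."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

def cosine_bow_similarity(a: str, b: str) -> float:
    """Cosine similarity over bag-of-words count vectors."""
    ca, cb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(ca[w] * cb[w] for w in ca.keys() & cb.keys())
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    if na == 0.0 or nb == 0.0:
        return 0.0
    return dot / (na * nb)

s1 = "the food was great"
s2 = "the food was terrible"
print(jaccard_similarity(s1, s2))     # 3 shared / 5 total tokens -> 0.6
print(cosine_bow_similarity(s1, s2))  # dot 3 over norms 2*2 -> 0.75
```

Embedding-based scorers (e.g. cosine similarity over DistilBERT sentence vectors) follow the same cosine formula, just over dense learned vectors instead of word counts.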
A notebook for a Medium article about text classification with Hugging Face DistilBERT and TensorFlow 2.0.
Our goal is to train a classifier that can predict the CEFR level of any given sentence. In this notebook we use 🤗 Hugging Face and its transformers library as the training framework, with PyTorch as the deep learning backend.
In this notebook, we will demonstrate the process of fine-tuning DistilBERT for sentiment analysis using a dataset of restaurant reviews. DistilBERT is a smaller, faster, and lighter version of BERT (Bidirectional Encoder Representations from Transformers), an encoder-based transformer model introduced by Google in 2018.
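The notebook fine-tunes DistilBERT via 🤗 Transformers, which requires heavyweight dependencies; as a dependency-free toy sketch of the core idea, training a classification head on top of fixed text features by gradient descent, here is a logistic-regression analogue (the data, vocabulary, and all names are illustrative, not the notebook's):

```python
import math
from collections import Counter

# Toy restaurant-review data (illustrative, not the notebook's dataset).
TRAIN = [
    ("the food was great and the service was friendly", 1),
    ("amazing pasta and a lovely atmosphere", 1),
    ("terrible service and the food was cold", 0),
    ("awful experience with rude staff and bland food", 0),
]

# Bag-of-words features stand in for DistilBERT's pooled hidden state.
vocab = sorted({w for text, _ in TRAIN for w in text.split()})

def featurize(text: str) -> list:
    counts = Counter(text.split())
    return [float(counts[w]) for w in vocab]

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

# Train a logistic-regression "classification head" with plain SGD.
weights = [0.0] * len(vocab)
bias = 0.0
lr = 0.5
for _ in range(200):
    for text, label in TRAIN:
        x = featurize(text)
        pred = sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias)
        err = pred - label  # gradient of cross-entropy w.r.t. the logit
        weights = [w - lr * err * xi for w, xi in zip(weights, x)]
        bias -= lr * err

def predict(text: str) -> int:
    x = featurize(text)
    return int(sigmoid(sum(w * xi for w, xi in zip(weights, x)) + bias) >= 0.5)

print([predict(t) for t, _ in TRAIN])  # should reproduce the training labels
print(predict("awful bland food"))
```

In the actual notebook, DistilBERT's encoder would replace the bag-of-words featurizer and its weights would be updated jointly with the head; the training loop above captures only the gradient-descent skeleton.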