# distilbert


This notebook demonstrates fine-tuning DistilBERT for sentiment analysis on a dataset of restaurant reviews. DistilBERT is a smaller, faster, and lighter distilled version of BERT (Bidirectional Encoder Representations from Transformers), the encoder-based transformer model introduced by Google in 2018.

  • Updated Jul 29, 2024
  • Jupyter Notebook
