
Distilgpt2 finetuning and text generation #3177

Closed
wants to merge 1 commit

Conversation

tripathiaakash
Contributor

This ipynb notebook contains a finetuning and text generation tutorial for distilgpt2. The tutorial also reuses code from the run_generation.py file to make generation faster than invoking the original script for every iteration.
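For context, here is a minimal sketch of the kind of workflow the notebook covers, written against the current transformers text-generation API. The model name distilgpt2 comes from the PR; the prompt and sampling parameters are illustrative assumptions, not the notebook's actual code:

```python
# Sketch: load distilgpt2 once and sample from it repeatedly.
# Keeping the model in memory is what avoids re-running the
# run_generation.py script (and reloading weights) per iteration.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

prompt = "Deep learning is"  # illustrative prompt, not from the notebook
inputs = tokenizer(prompt, return_tensors="pt")

# Top-k / nucleus sampling, similar in spirit to run_generation.py.
output_ids = model.generate(
    **inputs,
    max_length=50,
    do_sample=True,
    top_k=50,
    top_p=0.95,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```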
patrickvonplaten self-assigned this Mar 9, 2020
@patrickvonplaten
Contributor

Hi @Blackjack01 - thanks so much for this great contribution! We are still going through the notebook and discussing how to add it :-) Will let you know soon!

@patrickvonplaten
Contributor

Hi @Blackjack01, sorry for the late answer.
We now have community notebooks here: https://github.com/huggingface/transformers/tree/master/notebooks#community-notebooks
Feel free to open a PR to add it there :-)

tripathiaakash added a commit to tripathiaakash/transformers that referenced this pull request Jan 9, 2021
tripathiaakash added a commit to tripathiaakash/DistilGPT2-Tutorial that referenced this pull request Jan 9, 2021
2 participants