codechefvitcc/Distill_gpt2
About
Distilling GPT-2 to create smaller models that are experts in different domains
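
The repository only carries this short description, so the following is a minimal sketch of what one GPT-2 distillation step could look like, not the project's actual training code. It assumes the Hugging Face transformers library, a gpt2 teacher, a distilgpt2 student, and illustrative hyperparameters (temperature, alpha), and combines a softened KL distillation loss with the ordinary language-modelling loss.

```python
# Hypothetical sketch of GPT-2 knowledge distillation: a frozen GPT-2 teacher
# guides a smaller student (here DistilGPT2) on domain text via a soft-target
# KL loss plus the usual language-modelling loss. Model names, temperature,
# and loss weighting are illustrative assumptions, not the repo's settings.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModelForCausalLM

device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

teacher = AutoModelForCausalLM.from_pretrained("gpt2").to(device).eval()
student = AutoModelForCausalLM.from_pretrained("distilgpt2").to(device)

optimizer = torch.optim.AdamW(student.parameters(), lr=5e-5)
temperature = 2.0   # softens the teacher's token distribution
alpha = 0.5         # weight between distillation loss and LM loss

def distillation_step(texts):
    """One training step on a batch of domain-specific texts."""
    batch = tokenizer(texts, return_tensors="pt", padding=True,
                      truncation=True, max_length=128).to(device)
    labels = batch["input_ids"].clone()
    labels[batch["attention_mask"] == 0] = -100  # ignore padding in LM loss

    with torch.no_grad():
        teacher_logits = teacher(**batch).logits

    student_out = student(**batch, labels=labels)
    student_logits = student_out.logits

    # KL divergence between softened teacher and student distributions
    # (padding positions are not masked out here, for brevity)
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    loss = alpha * kd_loss + (1 - alpha) * student_out.loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example: a tiny medical-domain batch; in practice a separate student
# would be distilled on each domain's corpus.
print(distillation_step([
    "Patient presents with elevated blood pressure and mild tachycardia.",
    "The MRI showed no evidence of acute intracranial abnormality.",
]))
```

Under this reading of the description, one such student would be distilled per target domain, each trained only on that domain's corpus so it becomes a small domain expert.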