Implementation of a scaled-down ChatGPT-like Transformer pretraining pipeline using PyTorch.

Shakespeare Generation

This is a "simple" implementation of a character-level Transformer model, meant as a scaled-down version of the pretraining pipeline used for ChatGPT.

The model was trained on the Shakespeare dataset, a collection of 37 plays by William Shakespeare. The dataset is available on Kaggle.
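To illustrate what "character-level" means in this context (a minimal sketch, not this repository's actual code): the vocabulary is simply the set of distinct characters in the corpus, and text is mapped to integer indices before being fed to the Transformer. The sample text below is assumed for illustration.

```python
# Minimal sketch of character-level encoding, as typically used for
# Shakespeare-style pretraining (illustrative only, not this repo's code).
text = "First Citizen:\nBefore we proceed any further, hear me speak."

chars = sorted(set(text))                     # vocabulary = distinct characters
stoi = {ch: i for i, ch in enumerate(chars)}  # character -> integer id
itos = {i: ch for ch, i in stoi.items()}      # integer id -> character

def encode(s):
    """Map a string to a list of integer token ids."""
    return [stoi[c] for c in s]

def decode(ids):
    """Map a list of integer token ids back to a string."""
    return "".join(itos[i] for i in ids)

ids = encode("hear me")
assert decode(ids) == "hear me"  # encoding round-trips losslessly
```

The model is then trained to predict the next character id in the sequence; sampling from it character by character produces text like the excerpt below.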

Example of the Generated Text

Savester,
By my good friends.

POLIXENES:
Shy cannot call, madam you to undertake.
Ay, where are therea-faced a royal oracum,
Made hear here in means, poor else may nestire.
Here be return's womage pay, ladies
To any give either.

HENRY BOLINGBROKE:
Nor mindow, to God!

KING RICHARD III:
My lord, indeed;
In mine eye, be graced pardon'd, and out of your soldier
Nor yours.

RATCLIFF:
Cominius!
I am go. A bidden of one that you we should
Lord, you may converture ired Titus;
Which I am sure and man 

This project is still under development, so the results are not great yet.
