Building Transformers

Run the following command once the repository has been cloned

pip install -r requirements.txt

BERTMaskedLM

You can train the BERTMaskedLM model by running the following command

python train.py

Or you can load the pre-trained weights using

import torch

from model import BERTMaskedLM  # adjust the import path to wherever BERTMaskedLM is defined in this repo

# config, vocab_size and device are assumed to be set up as in train.py
bert = BERTMaskedLM(config=config, vocab_size=vocab_size)
bert.load_state_dict(torch.load("../weights/bert_masked_lm.pt",
                                map_location=torch.device(device=device)))
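The masked-LM objective trains the model to recover tokens that have been corrupted in the input. The repository's own masking code is not shown here, so the sketch below uses the standard BERT recipe (mask 15% of positions; of those, 80% become [MASK], 10% a random token, 10% stay unchanged) as an assumption rather than a confirmed detail of train.py:

```python
import random

MASK = "[MASK]"                 # BERT's mask token
VOCAB = ["a", "b", "c", "d"]    # toy vocabulary for illustration

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """BERT-style corruption. Returns (corrupted sequence, labels),
    where labels[i] is the original token the model must predict at a
    selected position, or -1 if position i is not part of the loss."""
    rng = random.Random(seed)
    corrupted, labels = list(tokens), [-1] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok          # predict the original token here
            r = rng.random()
            if r < 0.8:
                corrupted[i] = MASK              # 80%: replace with [MASK]
            elif r < 0.9:
                corrupted[i] = rng.choice(VOCAB)  # 10%: random token
            # remaining 10%: keep the original token unchanged
    return corrupted, labels
```

The loss (and the 88.58% accuracy reported below) is then computed only at positions where labels != -1.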

After training for 30 epochs, the model achieved an accuracy of 88.58% and a loss of 0.4307. The results are shown below

[bert_masked_lm_performance plot]

PoemGPT

PoemGPT is a decoder-only transformer model that generates poems in the style of Shakespeare. You can generate a poem by running the following command in the terminal

python gen_poem.py --num_tokens <num_chars in poem>
PRINCE PEY:
Very well! I will be naked to take when.

MARIANA:
Now sword? and who have friend out of the place
Artend I of your vental, and am not pratise
He disture friends in Playets to the comprove.

LUCIO:
By you women, I do put on cominition.
And whils woratil orp we moforey any
Moris: y maingive, os wacking, waras, t, ouns
Olyonesig? fagonad! cin, t, s,
I; fo, foce lelinsts!-t tate nope's; war!
Trimply titaved ps, ge
Fingeivedy, bequbupe, po.
Trhokeringe at bous ot: bys; wined iooualy; on
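A decoder-only model like PoemGPT generates text autoregressively: feed it the characters so far, sample one more character from its predicted distribution, append it, and repeat num_tokens times. The sketch below shows that loop in a self-contained form; the model, vocabulary, and temperature value are stand-ins for illustration, not the actual interface of gen_poem.py:

```python
import math
import random

def sample_next(logits, temperature=1.0, rng=random):
    """Sample an index from temperature-scaled softmax over logits."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                              # subtract max for numerical stability
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    r, acc = rng.random(), 0.0
    for i, e in enumerate(exps):
        acc += e / total
        if r < acc:
            return i
    return len(exps) - 1

def generate(model, prompt, num_tokens, vocab, seed=0):
    """Autoregressive decoding loop: model(sequence) -> next-char logits."""
    rng = random.Random(seed)
    out = list(prompt)
    for _ in range(num_tokens):
        logits = model(out)                      # stand-in for the transformer forward pass
        out.append(vocab[sample_next(logits, temperature=0.8, rng=rng)])
    return "".join(out)
```

With a toy model that returns uniform logits, generate(lambda seq: [1.0, 1.0, 1.0], "x", 10, list("ab ")) produces an 11-character string; swapping in a trained character-level transformer yields output like the sample above.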
