PhoGPT: Generative Pre-training for Vietnamese (2023)
An autoregressive language model like ChatGPT.
HELM-GPT: de novo macrocyclic peptide design using generative pre-trained transformer
A custom GPT based on [Zero To Hero](https://karpathy.ai/zero-to-hero.html) that uses tiktoken, intended to support AI Transformer-model education and the reverse engineering of GPT models from scratch.
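For context, a minimal sketch of the tiktoken round trip such a project builds on (the "gpt2" encoding name is tiktoken's stock GPT-2 byte-pair encoding; the sample string is arbitrary):

```python
import tiktoken

# Load the byte-pair encoding that GPT-2 uses.
enc = tiktoken.get_encoding("gpt2")

# Encode text into integer token IDs, then decode back to the original string.
tokens = enc.encode("Generative pre-trained transformers tokenize text first.")
print(tokens)              # a list of integer token IDs
print(enc.decode(tokens))  # round-trips to the original text
```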
Drawing inspiration from Andrej Karpathy’s iconic lecture, "Let’s Build GPT: From Scratch, in Code, Spelled Out", this project takes you on an immersive journey into the inner workings of GPT. Step by step, we’ll construct a GPT model from the ground up, demystifying its architecture and bringing its mechanics to life through hands-on coding.
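As a taste of what that construction covers, here is a minimal sketch of the single causal self-attention head the lecture builds up to; the n_embd / head_size / block_size names follow Karpathy’s nanoGPT conventions, not this repo’s actual code:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalSelfAttentionHead(nn.Module):
    """One head of masked (causal) self-attention."""

    def __init__(self, n_embd: int, head_size: int, block_size: int):
        super().__init__()
        self.key = nn.Linear(n_embd, head_size, bias=False)
        self.query = nn.Linear(n_embd, head_size, bias=False)
        self.value = nn.Linear(n_embd, head_size, bias=False)
        # Lower-triangular mask: each position may attend only to the past.
        self.register_buffer("tril", torch.tril(torch.ones(block_size, block_size)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, C = x.shape
        k, q, v = self.key(x), self.query(x), self.value(x)
        wei = q @ k.transpose(-2, -1) * k.shape[-1] ** -0.5  # scaled dot product
        wei = wei.masked_fill(self.tril[:T, :T] == 0, float("-inf"))
        wei = F.softmax(wei, dim=-1)
        return wei @ v  # (B, T, head_size)

x = torch.randn(4, 8, 32)  # (batch, time, channels)
head = CausalSelfAttentionHead(n_embd=32, head_size=16, block_size=8)
print(head(x).shape)       # torch.Size([4, 8, 16])
```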
A simple GPT app that uses the falcon-7b-instruct model with a Flask front-end.
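The repo’s actual wiring isn’t shown here, but a Flask front-end over falcon-7b-instruct might look roughly like the sketch below, using the Hugging Face transformers pipeline; the /generate route, JSON field names, and generation parameters are illustrative assumptions:

```python
from flask import Flask, jsonify, request
from transformers import pipeline

app = Flask(__name__)

# Loading a 7B-parameter model needs substantial RAM/GPU; do it once at startup.
generator = pipeline("text-generation", model="tiiuae/falcon-7b-instruct")

@app.post("/generate")  # hypothetical route; the repo's real API may differ
def generate():
    prompt = request.get_json()["prompt"]
    out = generator(prompt, max_new_tokens=128, do_sample=True)
    return jsonify({"completion": out[0]["generated_text"]})

if __name__ == "__main__":
    app.run(debug=True)
```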
An industrial project on NLP for finance applications.
An implementation of a GPT-2 variant.
ToyGPT, inspired by Andrej Karpathy’s GPT-from-scratch lecture, builds a toy generative pre-trained transformer at its most basic level: a simple bigram language model with attention, intended to teach the basics of creating an LLM from scratch.
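For illustration, a minimal sketch of such a bigram baseline, in the spirit of Karpathy’s lecture (vocab_size=65 assumes the character-level Shakespeare dataset; attention would be layered on top of this):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BigramLanguageModel(nn.Module):
    """Each token directly predicts a distribution over the next token."""

    def __init__(self, vocab_size: int):
        super().__init__()
        # Row i holds the logits for the token that follows token i.
        self.token_embedding = nn.Embedding(vocab_size, vocab_size)

    def forward(self, idx: torch.Tensor) -> torch.Tensor:
        return self.token_embedding(idx)  # (B, T, vocab_size) logits

    @torch.no_grad()
    def generate(self, idx: torch.Tensor, max_new_tokens: int) -> torch.Tensor:
        for _ in range(max_new_tokens):
            logits = self(idx)[:, -1, :]                # last position only
            probs = F.softmax(logits, dim=-1)
            idx_next = torch.multinomial(probs, num_samples=1)
            idx = torch.cat((idx, idx_next), dim=1)     # append the sample
        return idx

model = BigramLanguageModel(vocab_size=65)     # character-level Shakespeare size
start = torch.zeros((1, 1), dtype=torch.long)  # begin from token 0
print(model.generate(start, max_new_tokens=20))
```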
A Generatively Pretrained Transformer that generates Shakespeare-esque quotes.
Repository for all things Natural Language Processing
Generative Pre-trained Transformer 1 (GPT-1)
This is an NLP coursework repository for the Honours Bachelor of Artificial Intelligence program at Durham College. It contains the weekly labs, assignments, and final project completed during the Winter 2024 term.
PyTorch implementation of GPT from scratch
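For orientation, a minimal sketch of the pre-norm transformer block that such from-scratch GPT implementations stack; this one leans on PyTorch’s built-in nn.MultiheadAttention rather than a hand-rolled head, so details will differ from any given repo:

```python
import torch
import torch.nn as nn

class Block(nn.Module):
    """Pre-norm GPT block: causal self-attention plus an MLP, each with a residual."""

    def __init__(self, n_embd: int, n_head: int, block_size: int):
        super().__init__()
        self.ln1 = nn.LayerNorm(n_embd)
        self.attn = nn.MultiheadAttention(n_embd, n_head, batch_first=True)
        self.ln2 = nn.LayerNorm(n_embd)
        self.mlp = nn.Sequential(
            nn.Linear(n_embd, 4 * n_embd),
            nn.GELU(),
            nn.Linear(4 * n_embd, n_embd),
        )
        # Boolean causal mask: True marks future positions that may not be attended.
        mask = torch.triu(torch.ones(block_size, block_size, dtype=torch.bool), diagonal=1)
        self.register_buffer("mask", mask)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        T = x.size(1)
        h = self.ln1(x)
        attn_out, _ = self.attn(h, h, h, attn_mask=self.mask[:T, :T])
        x = x + attn_out               # residual around attention
        x = x + self.mlp(self.ln2(x))  # residual around the MLP
        return x

x = torch.randn(2, 16, 64)
print(Block(n_embd=64, n_head=4, block_size=16)(x).shape)  # torch.Size([2, 16, 64])
```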
Repository for personal experiments
I built a GPT model from scratch to generate text