I just wanted to personally thank everyone involved in this effort. Training is now far more accessible on DALLE-pytorch using the pretrained VAE you provided. Compute and memory costs are substantially lower and it's even possible for people to train a relatively large transformer under 16 GiB of VRAM.
It's early days, and no one has trained a "full DALL-E" yet, but this is a big step toward that goal, and momentum is already picking up on the repo.
So thanks and great work everyone. You're awesome.