Stars
Unified-Modal Speech-Text Pre-Training for Spoken Language Processing
An open-source text-to-speech system built by inverting Whisper.
Safe, efficient, and ergonomic bindings to Wolfram LibraryLink and the Wolfram Language
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference,…
In-browser ECDSA aggregation using Nova over secp/secq
Data compression using LSTM in TensorFlow
A hack implementation of CCS generic arithmetization; won a prize at the Zuzalu hackathon 2023 despite incompleteness
A curated list of zero-knowledge folding schemes
Code repository supporting the paper "Atlas: Few-shot Learning with Retrieval Augmented Language Models" (https://arxiv.org/abs/2208.03299)
Must-read research papers and links to tools and datasets related to using machine learning for compilers and systems optimisation
blaze is a Rust library for ZK acceleration on Xilinx FPGAs.
Convert code repos into an LLM prompt-friendly format. Mostly built by GPT-4.
Come join the best place on the internet to learn AI skills. Use code "chatbotui" for an extra 20% off.
AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools so that you can focus on what matters.
A curated list of prompts, tools, and resources regarding the GPT-4 language model.
The interdisciplinary field of Mathematics and Computer Science, distinguished by its emphasis on mathematical technique and rigour.
Resources of deep learning for mathematical reasoning (DL4MATH).
A Learning Environment for Theorem Proving with the Coq proof assistant
Implementation of RETRO, DeepMind's retrieval-based attention network, in PyTorch
Official repository for the paper "Tranception: Protein Fitness Prediction with Autoregressive Transformers and Inference-time Retrieval"
The simplest, fastest repository for training/finetuning medium-sized GPTs.
Stable Diffusion for real-time music generation
A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training
Measuring and graphing memory usage of local processes