Oomnitza - Tuam, Ireland
ML/AI 🤖 🧠
Curated list of awesome tools, demos, docs for ChatGPT and GPT-3
The standard data-centric AI package for data quality and machine learning with messy, real-world data and labels.
DeepSpeech is an open source embedded (offline, on-device) speech-to-text engine which can run in real time on devices ranging from a Raspberry Pi 4 to high power GPU servers.
The simplest, fastest repository for training/finetuning medium-sized GPTs.
Examples and guides for using the OpenAI API
🦜🔗 Build context-aware reasoning applications
Train transformer language models with reinforcement learning.
A PyTorch implementation of a streaming and non-streaming automatic speech recognition framework, compatible with both online and offline recognition. Currently supports Conformer, Squeezeformer, and DeepSpeech2 models, as well as multiple data augmentation methods.
The simplest way to serve AI/ML models in production
Free, ultrafast Copilot alternative for Vim and Neovim
OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and retrieve information dynamically to do so.
Explainable inference software supporting annotated, real-valued, graph-based, and temporal logic
Build LLM apps in TypeScript/JavaScript. 🧑💻 🧑💻 🧑💻 🚀 🚀 🚀
Record and replay LLM interactions for langchain
Turn your rough sketch into a refined image using AI
PyGWalker: Turn your pandas dataframe into an interactive UI for visual analysis
Running large language models on a single GPU for throughput-oriented scenarios.
Hopsworks - Data-Intensive AI platform with a Feature Store
High-accuracy RAG for answering questions from scientific documents, with citations
Data and code for "DocPrompting: Generating Code by Retrieving the Docs" @ICLR 2023
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
Official PyTorch implementation of VoxFormer [CVPR 2023 Highlight]
brat rapid annotation tool (brat) - for all your textual annotation needs
Lazy Predict helps build many basic models with little code and helps you understand which models work better without any parameter tuning
🐙 Guides, papers, lectures, notebooks, and resources for prompt engineering
A collection of libraries to optimise AI model performance
RWKV (pronounced RwaKuv) is an RNN with great LLM performance, which can also be directly trained like a GPT transformer (parallelizable). We are at RWKV-7 "Goose". So it combines the best of RNN and transformer.