ComfyUI docker images for use in GPU cloud and local environments. Includes AI-Dock base for authentication and improved user experience.
RunPod serverless worker for Fooocus-API. Standalone or with network volume
The Big List of Protests - An AI-assisted Protest Flyer parser and event aggregator
Streamlit web app for scheduling RunPod serverless models with automatic cronjobs to prevent cold starts. Includes Slack notifications and real-time monitoring.
Production-ready RunPod serverless endpoint and pod for Qwen-Image (20B) - Text-to-image generation with exceptional English and Chinese text rendering
RunPod Serverless Worker for the Stable Diffusion WebUI Forge API
Runpod-LLM provides ready-to-use container scripts for running large language models (LLMs) easily on RunPod.
A Rust SDK implementation of the Runpod API that enables seamless integration of GPU infrastructure into your applications, workflows, and automation systems.
Production-ready RunPod serverless endpoint for Kokoro TTS. Features high-quality text-to-speech, voice mixing, word-level timestamps, and phoneme generation. Optimized for fast cold starts and auto-scaling.
Headless three.js rendering using Puppeteer
RunPod serverless worker for vLLM text-generation inference. Simple, optimized, and customizable.
Build and deploy the PGCView pipeline endpoint in a RunPod serverless GPU environment.
This repository contains the RunPod serverless component of the SDGP project "quizzifyme"
Adds diarization to the faster-whisper RunPod worker
A Chrome extension that helps improve reading comprehension by generating an interactive, multiple choice quiz for any website
An AI/ML project deployment template using LitServe and RunPod as the backend service and Gradio for the UI.
Python client script for sending prompts to A1111 serverless worker endpoints and saving the results
This project hosts the LLaMA 3.1 CPP model on RunPod's serverless platform using Docker. It features a Python 3.11 environment with CUDA 12.2, enabling scalable AI request processing through configurable payload options and GPU support.
MLOps library for LLM deployment with the vLLM engine on RunPod's infrastructure.
RunPod serverless function for voice conversion using RVC-v2 (Retrieval-based Voice Conversion)
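Most of the workers listed above expose RunPod's standard serverless HTTP API. A minimal sketch of building a synchronous request against the `runsync` route, assuming a hypothetical endpoint ID and a hypothetical `prompt` input field (each worker defines its own input schema, so check the specific worker's README):

```python
# Sketch of a RunPod serverless "runsync" request. "my-endpoint-id" and
# the "prompt" field are placeholders, not values from any worker above.

def build_runsync_request(endpoint_id: str, api_key: str, payload: dict):
    """Return (url, headers, body) for a synchronous RunPod endpoint call."""
    url = f"https://api.runpod.ai/v2/{endpoint_id}/runsync"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    # RunPod workers receive their arguments wrapped in an "input" key.
    body = {"input": payload}
    return url, headers, body

if __name__ == "__main__":
    url, headers, body = build_runsync_request(
        "my-endpoint-id", "RUNPOD_API_KEY", {"prompt": "a red bicycle"}
    )
    print(url)
    # To actually send it: requests.post(url, headers=headers, json=body)
```

The `/runsync` route blocks until the worker returns; the `/run` route instead returns a job ID to poll, which suits long-running jobs and the cold-start schedulers described above.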