fszontagh/stable-diffusion.cpp-restapi

SDCPP-RESTAPI

⚠️ WARNING: This is a vibe coded project!

C++20 REST API server for stable-diffusion.cpp, providing HTTP endpoints for AI-powered image and video generation.

Features

  • Generation: txt2img, img2img, txt2vid, upscaling (ESRGAN)
  • Model Architectures: SD 1.x/2.x, SDXL, Flux, SD3, Wan, Z-Image, Qwen
  • Queue System: Job queue with persistence and WebSocket progress updates
  • Live Preview: Real-time preview during generation (TAE/VAE modes)
  • Web UI: Integrated Vue.js interface
  • Model Downloads: CivitAI and HuggingFace integration
  • LLM Assistant: Ollama-compatible prompt enhancement and chat
  • GPU Backends: CUDA, Vulkan, Metal, ROCm, OpenCL

Quick Start

Prerequisites

  • CMake 3.20+
  • Ninja build system
  • C++20 compiler (GCC 11+ / Clang 14+)
  • OpenSSL, ZLIB
  • GPU drivers (CUDA, Vulkan, Metal, or ROCm)
  • Node.js 16+ (for Web UI, optional)

Build

mkdir build && cd build
cmake .. -G Ninja -DCMAKE_BUILD_TYPE=Release -DSD_CUDA=ON
ninja

GPU backend options:

  • -DSD_CUDA=ON - NVIDIA CUDA
  • -DSD_VULKAN=ON - Vulkan (cross-platform)
  • -DSD_METAL=ON - Apple Metal
  • -DSD_HIPBLAS=ON - AMD ROCm
  • -DSD_OPENCL=ON - OpenCL

Configure

cp config.example.json config.json
# Edit config.json with your model paths

Run

./bin/sdcpp-restapi

Server starts at http://localhost:8080 by default.
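Once the server is running, a quick liveness probe against the GET /health endpoint can confirm it is up. This is only a sketch using the Python standard library and assumes the default port:

```python
import urllib.request


def server_healthy(base_url: str = "http://localhost:8080") -> bool:
    """Return True if GET /health answers with HTTP 200, False otherwise."""
    try:
        with urllib.request.urlopen(base_url + "/health", timeout=5) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, DNS failure, or timeout: server not reachable.
        return False
```

Calling `server_healthy()` with a different `base_url` lets you probe a non-default host or port.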

Installation (Linux with systemd)

For production deployment on Linux, use the install script, which sets up a systemd service:

# Build first (see Quick Start above)
cd build && ninja

# Install as system service
sudo ./scripts/install.sh

Install Script Features

  • Systemd Integration: Automatic startup, restart on failure
  • Interactive Setup: Prompts for server port, model paths, LLM configuration
  • Ollama/LLM Config: Configure prompt enhancement and AI assistant endpoints
  • Directory Structure: Creates organized model subdirectories
  • Config Preservation: Updates preserve existing custom settings
  • Security Hardening: Service runs with restricted permissions

Install Options

sudo ./scripts/install.sh [options]

Options:
  --user USER       Run service as USER (default: current user)
  --host HOST       Server listen address (default: 0.0.0.0)
  --port PORT       Server port (default: 8080)
  --models-dir DIR  Directory for model files
  --output-dir DIR  Directory for generated images
  --no-service      Don't install systemd service
  --update-webui    Quick update of Web UI files only
  --uninstall       Remove the installation

Service Management

sudo systemctl start sdcpp-restapi    # Start
sudo systemctl stop sdcpp-restapi     # Stop
sudo systemctl status sdcpp-restapi   # Status
sudo journalctl -u sdcpp-restapi -f   # View logs

Installation Paths

Path                             Description
/opt/sdcpp-restapi/              Binary and Web UI
/etc/sdcpp-restapi/config.json   Configuration file
<models-dir>/                    Model subdirectories (checkpoints, vae, lora, etc.)

Configuration

Edit config.json to set:

  • server: Host, port, thread count
  • paths: Model directories (checkpoints, VAE, LoRA, CLIP, T5, etc.)
  • sd_defaults: Default generation settings
  • preview: Live preview mode and interval
  • ollama: LLM prompt enhancement settings
  • assistant: LLM chat assistant settings

See config.example.json for all options.
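As an illustration, a minimal config.json might look like the sketch below. The top-level keys mirror the list above, but the nested field names and values are assumptions; config.example.json is the authoritative reference.

```json
{
  "server": { "host": "0.0.0.0", "port": 8080, "threads": 4 },
  "paths": {
    "checkpoints": "/opt/models/checkpoints",
    "vae": "/opt/models/vae",
    "lora": "/opt/models/lora"
  },
  "sd_defaults": { "width": 512, "height": 512, "steps": 20 },
  "preview": { "mode": "tae", "interval": 1 },
  "ollama": { "url": "http://localhost:11434" }
}
```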

API

Full API documentation: docs/API.md

Key Endpoints

Endpoint             Description
GET /health          Server status and loaded model info
GET /models          List available models
POST /models/load    Load a model
POST /txt2img        Text-to-image generation
POST /img2img        Image-to-image generation
POST /txt2vid        Text-to-video generation
POST /upscale        Image upscaling
GET /queue/{id}      Get job status
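The endpoints above can be exercised from any HTTP client. As a sketch using only the Python standard library (the request field names `prompt`, `width`, `height`, and `steps` are assumptions here; docs/API.md has the actual schema), a txt2img job could be submitted like this:

```python
import json
import urllib.request

# Hypothetical request body -- consult docs/API.md for the real field names.
TXT2IMG_PAYLOAD = {
    "prompt": "a lighthouse at dusk, oil painting",
    "width": 512,
    "height": 512,
    "steps": 20,
}


def submit_txt2img(payload: dict, base_url: str = "http://localhost:8080") -> dict:
    """POST the payload to /txt2img and return the decoded JSON response.

    The response is expected to contain a job id that can then be
    polled via GET /queue/{id}.
    """
    req = urllib.request.Request(
        base_url + "/txt2img",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Since generation is queued, a client would typically submit the job, remember the returned id, and either poll GET /queue/{id} or listen on the WebSocket port for progress.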

WebSocket

Real-time updates on port 8081:

  • Job progress and previews
  • Model loading status
  • Server heartbeat

Web UI

Access the integrated web interface at http://localhost:8080/ui

Screenshots

File manager

  (screenshot)

Webui

  • Dashboard (screenshot)
  • Model management (screenshot)
  • Queue (screenshot)
  • Assistant (Ollama) (screenshot)

Documentation

See docs/API.md for the full API documentation.

License

MIT License
