Commit 5226227 (parent b57f399): Add backends docs
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

docs/content/backends.md (1 file changed, 118 additions, 0 deletions)
---
title: "Backends"
description: "Learn how to use, manage, and develop backends in LocalAI"
weight: 4
---

# Backends

LocalAI supports a variety of backends for running different types of AI models. A backend is a containerized application that provides the runtime environment for a specific model type, such as LLMs, diffusion models, or text-to-speech models. Some core backends are included with LocalAI; others can be installed on demand from backend galleries.

## Managing Backends in the UI

The LocalAI web interface provides an intuitive way to manage your backends:

1. Navigate to the "Backends" section in the navigation menu
2. Browse available backends from configured galleries
3. Use the search bar to find specific backends by name, description, or type
4. Filter backends by type using the quick filter buttons (LLM, Diffusion, TTS, Whisper)
5. Install or delete backends with a single click
6. Monitor installation progress in real time

Each backend card displays:

- Backend name and description
- Type of models it supports
- Installation status
- Action buttons (Install/Delete)
- Additional information via the info button

## Backend Galleries

Backend galleries are repositories that contain backend definitions. They work similarly to model galleries but are specifically for backends.

### Adding a Backend Gallery

You can add backend galleries by setting the `LOCALAI_BACKEND_GALLERIES` environment variable:

```bash
export LOCALAI_BACKEND_GALLERIES='[{"name":"my-gallery","url":"https://raw.githubusercontent.com/username/repo/main/backends"}]'
```

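The JSON value can be awkward to quote by hand. A minimal sketch, assuming `jq` is available, builds it programmatically instead; the gallery name and URL are illustrative placeholders:

```bash
# Build the gallery list as compact JSON with jq to avoid quoting mistakes.
GALLERIES=$(jq -n -c '[{name: "my-gallery", url: "https://raw.githubusercontent.com/username/repo/main/backends"}]')
export LOCALAI_BACKEND_GALLERIES="$GALLERIES"
echo "$LOCALAI_BACKEND_GALLERIES"
```

This keeps the single quotes out of the shell command entirely, which helps when the URL itself contains characters the shell would otherwise interpret.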
The URL needs to point to a valid YAML file, for example:

```yaml
- name: "test-backend"
  uri: "quay.io/image/tests:localai-backend-test"
  alias: "foo-backend"
```

Here, `uri` is the reference of an OCI container image.

### Backend Gallery Structure

A backend gallery is a collection of YAML files, each defining a backend. Here's an example structure:

```yaml
# backends/llm-backend.yaml
name: "llm-backend"
description: "A backend for running LLM models"
uri: "quay.io/username/llm-backend:latest"
alias: "llm"
tags:
  - "llm"
  - "text-generation"
```
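As a quick sanity check, you can list the backend names declared in a gallery file with standard shell tools. A minimal sketch, where the inlined file stands in for a downloaded gallery index:

```bash
# Write a sample gallery file (illustrative content only).
cat <<'EOF' > /tmp/gallery.yaml
- name: "llm-backend"
  uri: "quay.io/username/llm-backend:latest"
  alias: "llm"
EOF

# List the declared backend names, stripping the key and quotes.
grep -E '^- name:' /tmp/gallery.yaml | sed 's/^- name: *//; s/"//g'
```

For anything beyond a quick check, a real YAML parser is more robust than `grep`, since names may appear at other indentation levels.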
## Pre-installing Backends

You can pre-install backends when starting LocalAI using the `LOCALAI_EXTERNAL_BACKENDS` environment variable:

```bash
export LOCALAI_EXTERNAL_BACKENDS="llm-backend,diffusion-backend"
local-ai run
```
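If you run LocalAI with Docker Compose instead, the same variable can be set in the service definition. A sketch, where the image tag and port mapping are assumptions to adjust for your deployment:

```yaml
# docker-compose.yaml sketch; image tag is illustrative.
services:
  localai:
    image: localai/localai:latest
    environment:
      - LOCALAI_EXTERNAL_BACKENDS=llm-backend,diffusion-backend
    ports:
      - "8080:8080"
```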
## Creating a Backend

To create a new backend, you need to:

1. Create a container image that implements the LocalAI backend interface
2. Define a backend YAML file
3. Publish your backend to a container registry

### Backend Container Requirements

Your backend container should:

1. Implement the LocalAI backend interface (gRPC or HTTP)
2. Handle model loading and inference
3. Support the required model types
4. Include necessary dependencies
5. Have a top-level `run.sh` file that will be used to run the backend
6. Be pushed to a registry so it can be used in a gallery

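A minimal sketch of what a top-level `run.sh` might look like. The bind address and the server binary path are illustrative assumptions, not a fixed contract:

```bash
#!/bin/bash
# Hypothetical entrypoint for a backend container. LocalAI runs this
# script to start the backend process inside the container.
set -euo pipefail

# Bind address; the default here is an illustrative assumption.
ADDR="${ADDR:-127.0.0.1:50051}"

echo "starting backend on ${ADDR}"
# Replace this placeholder with your actual backend server binary:
# exec /opt/backend/server --addr "${ADDR}"
```

Using `exec` for the final server launch keeps the server as PID 1's direct child in the container, so signals such as SIGTERM reach it on shutdown.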
### Publishing Your Backend

1. Build your container image:

   ```bash
   docker build -t quay.io/username/my-backend:latest .
   ```

2. Push to a container registry:

   ```bash
   docker push quay.io/username/my-backend:latest
   ```

3. Add your backend to a gallery:
   - Create a YAML entry in your gallery repository
   - Include the backend definition
   - Make the gallery accessible via HTTP/HTTPS

## Backend Types

LocalAI supports various types of backends:

- **LLM Backends**: For running language models
- **Diffusion Backends**: For image generation
- **TTS Backends**: For text-to-speech conversion
- **Whisper Backends**: For speech-to-text conversion
