🐳 : Further Docker build Cleanup & Docs Update #1502

Merged 5 commits on Jan 6, 2024
12 changes: 11 additions & 1 deletion .dockerignore
@@ -1,7 +1,17 @@
node_modules
**/.circleci
**/.editorconfig
**/.dockerignore
**/.git
**/.DS_Store
**/.vscode
**/node_modules

# Specific patterns to ignore
data-node
meili_data*
librechat*
Dockerfile*
docs

# Ignore all hidden files
.*
4 changes: 2 additions & 2 deletions Dockerfile
@@ -6,10 +6,10 @@ WORKDIR /app

# Allow mounting of these files, which have no default
# values.
RUN touch .env librechat.yaml
RUN touch .env
# Install call deps - Install curl for health check
RUN apk --no-cache add curl && \
npm ci
npm ci

# React client build
ENV NODE_OPTIONS="--max-old-space-size=2048"
5 changes: 5 additions & 0 deletions docker-compose.override.yml.example
@@ -8,6 +8,11 @@ version: '3.4'
# # SAVE THIS FILE AS 'docker-compose.override.yaml'
# # AND USE THE 'docker-compose build' & 'docker-compose up -d' COMMANDS AS YOU WOULD NORMALLY DO

# # USE LIBRECHAT CONFIG FILE
# api:
# volumes:
# - ./librechat.yaml:/app/librechat.yaml

# # BUILD FROM LATEST IMAGE
# api:
# image: ghcr.io/danny-avila/librechat-dev:latest
1 change: 0 additions & 1 deletion docker-compose.yml
@@ -25,7 +25,6 @@ services:
volumes:
- ./.env:/app/.env
- ./images:/app/client/public/images
- ./librechat.yaml:/app/librechat.yaml
mongodb:
container_name: chat-mongodb
image: mongo
2 changes: 2 additions & 0 deletions docs/install/configuration/ai_setup.md
@@ -48,6 +48,8 @@ Using the default environment values from [/.env.example](https://github.com/dan

This guide will walk you through setting up each Endpoint as needed.

For **custom endpoint** configuration, such as adding [Mistral AI](https://docs.mistral.ai/platform/client/) or [Openrouter](https://openrouter.ai/) refer to the **[librechat.yaml configuration guide](./custom_config.md)**.

**Reminder: If you use docker, you should [rebuild the docker image (here's how)](dotenv.md) each time you update your credentials**

*Note: Configuring pre-made Endpoint/model/conversation settings as singular options for your users is a planned feature. See the related discussion here: [System-wide custom model settings (lightweight GPTs) #1291](https://github.com/danny-avila/LibreChat/discussions/1291)*
136 changes: 107 additions & 29 deletions docs/install/configuration/custom_config.md
@@ -1,15 +1,29 @@
---
title: 🖥️ Custom Endpoints & Config
description: Comprehensive guide for configuring the `librechat.yaml` file AKA the LibreChat Config file. This document is your one-stop resource for understanding and customizing endpoints & other integrations.
weight: -10
---

# LibreChat Configuration Guide

This document provides detailed instructions for configuring the `librechat.yaml` file used by LibreChat.
Welcome to the guide for configuring the **librechat.yaml** file in LibreChat.

This file enables the integration of custom AI endpoints, allowing you to connect with any AI provider compliant with OpenAI API standards.

This includes providers like [Mistral AI](https://docs.mistral.ai/platform/client/), as well as reverse proxies that facilitate access to OpenAI servers, adding them alongside existing endpoints like Anthropic.

In future updates, some of the configurations from [your `.env` file](./dotenv.md) will migrate here.
![image](https://github.com/danny-avila/LibreChat/assets/110412045/fd0d2307-008f-4e1d-b75b-4f141070ce71)

Further customization of the current configurations are also planned.
Future updates will streamline configuration further by migrating some settings from [your `.env` file](./dotenv.md) to `librechat.yaml`.

Stay tuned for ongoing enhancements to customize your LibreChat instance!

# Table of Contents

1. [Intro](#librechat-configuration-guide)
- [Configuration Overview](#configuration-overview)
- [Setup](#setup)
- [Docker Setup](#docker-setup)
- [Config Structure](#config-structure)
- [1. Version](#1-version)
- [2. Cache Settings](#2-cache-settings)
- [3. Endpoints](#3-endpoints)
@@ -19,10 +19,33 @@ Further customization of the current configurations are also planned.
- [Breakdown of Default Params](#breakdown-of-default-params)
- [Example Config](#example-config)

## Configuration Overview
## Setup

**The `librechat.yaml` file should be placed in the root of the project where the .env file is located.**

You can copy the [example config file](#example-config) as a good starting point while reading the rest of the guide.

The `librechat.yaml` file contains several key sections.
The example config file has some options ready to go for Mistral AI and Openrouter.
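
As a rough starting sketch (the Mistral values mirror the example later in this guide, and the `version` and `cache` fields are covered in the sections below), a minimal `librechat.yaml` could look like this:

```yaml
version: 1.0.0
cache: true

endpoints:
  custom:
    # Example: a Mistral AI endpoint; swap in your own provider's values
    - name: "Mistral"
      apiKey: "${MISTRAL_API_KEY}"
      baseURL: "https://api.mistral.ai/v1"
      models:
        default: ["mistral-tiny", "mistral-small", "mistral-medium"]
```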

## Docker Setup

For Docker, you need to use an [override file](./docker_override.md), named `docker-compose.override.yml`, to ensure the config file is mounted into the container.

- First, stop your running containers with `docker-compose down`
- Create (or edit your existing) `docker-compose.override.yml` at the root of the project:

```yaml
# For more details on the override file, see the Docker Override Guide:
# https://docs.librechat.ai/install/configuration/docker_override.html

version: '3.4'

services:
api:
volumes:
- ./librechat.yaml:/app/librechat.yaml
```

- Start Docker again, and your config file settings should take effect:
```bash
docker-compose up # no need to rebuild
```
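
If you want to double-check that the bind mount worked, a quick optional sanity check — assuming the default `api` service name — is to list the file from inside the running container:

```bash
docker-compose exec api ls -l /app/librechat.yaml
```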

## Config Structure

**Note:** Fields not specifically mentioned as required are optional.

@@ -48,36 +91,61 @@ The `librechat.yaml` file contains several key sections.
- **Description**: Each object in the array represents a unique endpoint configuration.
- **Required**

#### Endpoint Object Structure
## Endpoint Object Structure
Each endpoint in the `custom` array should have the following structure:

- **name**: A unique name for the endpoint.
```yaml
# Example Endpoint Object Structure
endpoints:
custom:
- name: "Mistral"
apiKey: "${YOUR_ENV_VAR_KEY}"
baseURL: "https://api.mistral.ai/v1"
models:
default: ["mistral-tiny", "mistral-small", "mistral-medium"]
titleConvo: true
titleModel: "mistral-tiny"
summarize: false
summaryModel: "mistral-tiny"
forcePrompt: false
modelDisplayLabel: "Mistral"
addParams:
safe_mode: true
dropParams: ["stop", "temperature", "top_p"]
```

### **name**:
> A unique name for the endpoint.
- Type: String
- Example: `name: "Mistral"`
- **Required**
- **Note**: Will be used as the "title" in the Endpoints Selector

- **apiKey**: Your API key for the service. Can reference an environment variable, or allow user to provide the value.
### **apiKey**:
> Your API key for the service. Can reference an environment variable, or allow user to provide the value.
- Type: String (apiKey | `"user_provided"`)
- **Example**: `apiKey: "${MISTRAL_API_KEY}"` | `apiKey: "your_api_key"` | `apiKey: "user_provided"`
- Example: `apiKey: "${MISTRAL_API_KEY}"` | `apiKey: "your_api_key"` | `apiKey: "user_provided"`
- **Required**
- **Note**: It's highly recommended to use the env. variable reference for this field, i.e. `${YOUR_VARIABLE}`
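
For illustration — assuming you have added a line such as `MISTRAL_API_KEY=your-actual-key` to your `.env` file (the variable name is just an example) — the reference in `librechat.yaml` would look like this:

```yaml
endpoints:
  custom:
    - name: "Mistral"
      # Resolved from the MISTRAL_API_KEY entry in your .env file
      apiKey: "${MISTRAL_API_KEY}"
```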

- **baseURL**: Base URL for the API. Can reference an environment variable, or allow user to provide the value.
### **baseURL**:
> Base URL for the API. Can reference an environment variable, or allow user to provide the value.
- Type: String (baseURL | `"user_provided"`)
- **Example**: `baseURL: "https://api.mistral.ai/v1"` | `baseURL: "${MISTRAL_BASE_URL}"` | `baseURL: "user_provided"`
- Example: `baseURL: "https://api.mistral.ai/v1"` | `baseURL: "${MISTRAL_BASE_URL}"` | `baseURL: "user_provided"`
- **Required**
- **Note**: It's highly recommended to use the env. variable reference for this field, i.e. `${YOUR_VARIABLE}`

- **iconURL**: The URL to use as the Endpoint Icon.
### **iconURL**:
> The URL to use as the Endpoint Icon.
- Type: String
- Example: `iconURL: https://github.com/danny-avila/LibreChat/raw/main/docs/assets/LibreChat.svg`
- **Note**: The following are "known endpoints" (case-insensitive), which have icons provided for them. If your endpoint `name` matches these, you should omit this field:
- **Note**: The following are "known endpoints" (case-insensitive), which have icons provided for them. If your endpoint `name` matches the following names, you should omit this field:
- "Mistral"
- "OpenRouter"

- **models**: Configuration for models.
- **Required**
### **models**:
> Configuration for models.
- **Required**
- **default**: An array of strings indicating the default models to use. At least one value is required.
- Type: Array of Strings
- Example: `default: ["mistral-tiny", "mistral-small", "mistral-medium"]`
@@ -87,44 +155,54 @@ Each endpoint in the `custom` array should have the following structure:
- Example: `fetch: true`
- **Note**: May cause slowdowns during initial use of the app if the response is delayed. Defaults to `false`.
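
Putting the two model options together, a sketch of the `models` block might look like this (the model names are placeholders for whatever your provider actually offers):

```yaml
models:
  default: ["mistral-tiny", "mistral-small", "mistral-medium"]
  fetch: true  # optional; attempts to fetch the provider's live model list
```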

- **titleConvo**: Enables title conversation when set to `true`.
### **titleConvo**:
> Enables title conversation when set to `true`.
- Type: Boolean
- Example: `titleConvo: true`

- **titleMethod**: Chooses between "completion" or "functions" for title method.
### **titleMethod**:
> Chooses between "completion" or "functions" for title method.
- Type: String (`"completion"` | `"functions"`)
- Example: `titleMethod: "completion"`
- **Note**: Defaults to "completion" if omitted.

- **titleModel**: Specifies the model to use for titles.
### **titleModel**:
> Specifies the model to use for titles.
- Type: String
- Example: `titleModel: "mistral-tiny"`
- **Note**: Defaults to "gpt-3.5-turbo" if omitted. May cause issues if "gpt-3.5-turbo" is not available.

- **summarize**: Enables summarization when set to `true`.
### **summarize**:
> Enables summarization when set to `true`.
- Type: Boolean
- Example: `summarize: false`
- **Note**: This feature requires an OpenAI Functions compatible API

- **summaryModel**: Specifies the model to use if summarization is enabled.
### **summaryModel**:
> Specifies the model to use if summarization is enabled.
- Type: String
- Example: `summaryModel: "mistral-tiny"`
- **Note**: Defaults to "gpt-3.5-turbo" if omitted. May cause issues if "gpt-3.5-turbo" is not available.

- **forcePrompt**: If `true`, sends a `prompt` parameter instead of `messages`.
### **forcePrompt**:
> If `true`, sends a `prompt` parameter instead of `messages`.
- Type: Boolean
- Example: `forcePrompt: false`
- **Note**: This combines all messages into a single text payload, [following OpenAI format](https://github.com/pvicente/openai-python/blob/main/chatml.md), and uses the `/completions` endpoint of your baseURL rather than `/chat/completions`.
- **Note**: This combines all messages into a single text payload, [following OpenAI format](https://github.com/pvicente/openai-python/blob/main/chatml.md), and

- **modelDisplayLabel**: The label displayed in messages next to the Icon for the current AI model.
uses the `/completions` endpoint of your baseURL rather than `/chat/completions`.

### **modelDisplayLabel**:
> The label displayed in messages next to the Icon for the current AI model.
- Type: String
- Example: `modelDisplayLabel: "Mistral"`
- **Note**: The display order is:
- 1. Custom name set via preset (if available)
- 2. Label derived from the model name (if applicable)
- 3. This value, `modelDisplayLabel`, is used if the above are not specified. Defaults to "AI".

- **addParams**: Adds additional parameters to requests.
### **addParams**:
> Adds additional parameters to requests.
- Type: Object/Dictionary
- **Description**: Adds/Overrides parameters. Useful for specifying API-specific options.
- **Example**:
@@ -133,12 +211,12 @@ Each endpoint in the `custom` array should have the following structure:
```yaml
addParams:
  safe_mode: true
```

- **dropParams**: Removes default parameters from requests.
### **dropParams**:
> Removes [default parameters](#default-parameters) from requests.
- Type: Array/List of Strings
- **Description**: Excludes specified default parameters. Useful for APIs that do not accept or recognize certain parameters.
- **Description**: Excludes specified [default parameters](#default-parameters). Useful for APIs that do not accept or recognize certain parameters.
- **Example**: `dropParams: ["stop", "temperature", "top_p"]`
- **Note**: For a list of default parameters sent with every request, see the "Default Parameters" Section below.

- **Note**: For a list of default parameters sent with every request, see the ["Default Parameters"](#default-parameters) Section below.
## Additional Notes
- Ensure that all URLs and keys are correctly specified to avoid connectivity issues.

13 changes: 12 additions & 1 deletion docs/install/configuration/docker_override.md
@@ -34,7 +34,18 @@ Open your `docker-compose.override.yml` file with vscode or any text editor.

Make your desired changes by uncommenting the relevant sections and customizing them as needed.

For example, if you want to use a prebuilt image for the `api` service and expose MongoDB's port, your `docker-compose.override.yml` might look like this:
For example, if you want to make sure Docker can use your `librechat.yaml` file for [custom configuration](./custom_config.md), it would look like this:

```yaml
version: '3.4'

services:
api:
volumes:
- ./librechat.yaml:/app/librechat.yaml
```

Or, if you want to use a prebuilt image for the `api` service and expose MongoDB's port, your `docker-compose.override.yml` might look like this:

```yaml
version: '3.4'
2 changes: 1 addition & 1 deletion docs/install/configuration/dotenv.md
@@ -1,7 +1,7 @@
---
title: ⚙️ Environment Variables
description: Comprehensive guide for configuring your application's environment with the `.env` file. This document is your one-stop resource for understanding and customizing the environment variables that will shape your application's behavior in different contexts.
weight: -10
weight: -11
---

# .env File Configuration
2 changes: 1 addition & 1 deletion docs/install/configuration/index.md
@@ -7,7 +7,7 @@ weight: 2
# Configuration

* ⚙️ [Environment Variables](./dotenv.md)
* 🖥️ [Custom Config & Endpoints](./configuration/custom_config.md)
* 🖥️ [Custom Endpoints & Config](./custom_config.md)
* 🐋 [Docker Compose Override](./docker_override.md)
---
* 🤖 [AI Setup](./ai_setup.md)
2 changes: 1 addition & 1 deletion docs/install/index.md
@@ -17,7 +17,7 @@ weight: 1
## **[Configuration](./configuration/index.md)**

* ⚙️ [Environment Variables](./configuration/dotenv.md)
* 🖥️ [Custom Config & Endpoints](./configuration/custom_config.md)
* 🖥️ [Custom Endpoints & Config](./configuration/custom_config.md)
* 🐋 [Docker Compose Override](./configuration/docker_override.md)
* 🤖 [AI Setup](./configuration/ai_setup.md)
* 🚅 [LiteLLM](./configuration/litellm.md)
36 changes: 36 additions & 0 deletions docs/install/installation/container_install.md
@@ -222,4 +222,40 @@ podman stop librechat && systemctl --user start container-librechat

---

## Integrating the Configuration File in Podman Setup

When using Podman for setting up LibreChat, you can also integrate the [`librechat.yaml` configuration file](../configuration/custom_config.md).

This file allows you to define specific settings and AI endpoints, such as Mistral AI, tailoring the application to your needs.

After creating your `.env` file as detailed in the previous steps, follow these instructions to integrate the `librechat.yaml` configuration:

1. Place your `librechat.yaml` file in your project's root directory.
2. Modify the Podman run command for the LibreChat container so that it maps the `librechat.yaml` file into the container, by adding the following volume argument:

```bash
-v "./librechat.yaml:/app/librechat.yaml"
```

For example, the modified Podman run command for starting LibreChat will look like this:

```bash
podman run \
--name="librechat" \
--network=librechat \
--env-file="./.env" \
-v "./librechat.yaml:/app/librechat.yaml" \
-p 3080:3080 \
--detach \
librechat:local;
```

By mapping the `librechat.yaml` file into the container, Podman ensures that your custom configurations are applied to LibreChat, enabling a tailored AI experience.

Ensure that the `librechat.yaml` file is correctly formatted and contains valid settings.

Any errors in this file might affect the functionality of LibreChat. For more information on configuring `librechat.yaml`, refer to the [configuration guide](../configuration/custom_config.md).
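
One quick, informal way to confirm the file at least parses as valid YAML — assuming Python with the PyYAML package is available on your host; this is only a syntax check, not a validation of LibreChat settings — is:

```bash
python3 -c "import yaml; yaml.safe_load(open('librechat.yaml')); print('librechat.yaml parses as YAML')"
```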

---

>⚠️ Note: If you're having trouble, before creating a new issue, please search for similar ones on our [#issues thread on our discord](https://discord.gg/weqZFtD9C4) or our [troubleshooting discussion](https://github.com/danny-avila/LibreChat/discussions/categories/troubleshooting) on our Discussions page. If you don't find a relevant issue, feel free to create a new one and provide as much detail as possible.
3 changes: 3 additions & 0 deletions docs/install/installation/docker_compose_install.md
@@ -34,6 +34,9 @@ Before running LibreChat with Docker, you need to configure some settings:
#### [AI Setup](../configuration/ai_setup.md) (Required)
At least one AI endpoint should be set up for use.

#### [Custom Endpoints & Configuration](../configuration/custom_config.md#docker-setup) (Optional)
Allows you to customize AI endpoints, such as Mistral AI, and other settings to suit your specific needs.

#### [Manage Your MongoDB Database](../../features/manage_your_database.md) (Optional)
Safely access and manage your MongoDB database using Mongo Express
