Hello,
I am trying to get a Docker setup running (CPU and GPU) that has both the Chainlit application (persisting data with Postgres) and the Indexer UI enabled. I don't need the Ray-related features right now. I want to supply the LLM, VLM, and EMBEDDER environment variables below, pointing at my own endpoints, rather than running vLLM or Ollama inside Docker; vLLM and Ollama should therefore not appear in the docker-compose file (using an Infinity server for the re-ranking model is fine).
I would appreciate any help putting together a docker-compose.yaml plus .env file that achieves this.
```bash
# LLM - for conversation
BASE_URL=
API_KEY=
MODEL=

# VLM - for image interpretation
VLM_BASE_URL=
VLM_API_KEY=
VLM_MODEL=

# EMBEDDER - for text vectorization
EMBEDDER_BASE_URL=
EMBEDDER_MODEL_NAME=
EMBEDDER_API_KEY=
```
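For reference, here is a minimal sketch of the kind of compose file I have in mind. Everything in it is an assumption on my part: the service names, image names, ports, and the app image itself are placeholders, not this project's actual configuration. The idea is that the app reads the endpoint variables from `.env` and only Postgres runs as a sidecar:

```yaml
# Hypothetical sketch only -- image names, service names, and ports are
# placeholders, not taken from this project's real docker-compose.yaml.
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: chainlit       # placeholder credentials
      POSTGRES_PASSWORD: chainlit
      POSTGRES_DB: chainlit
    volumes:
      - pgdata:/var/lib/postgresql/data

  app:
    image: your-app-image:latest    # replace with the project's app image
    env_file: .env                  # LLM/VLM/EMBEDDER endpoint variables
    depends_on:
      - postgres
    ports:
      - "8000:8000"                 # Chainlit / Indexer UI port, placeholder

volumes:
  pgdata:
```

No vLLM or Ollama services appear here; the external endpoints from `.env` would need to be reachable from inside the `app` container.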