
Efflux - Backend

LLM Agent Chat Client Backend




Efflux is an LLM-based Agent chat client featuring streaming responses and chat history management. As an MCP Host, it leverages the Model Context Protocol to connect with various MCP Servers, enabling standardized tool invocation and data access for large language models.

Key Features

  • Rapid Agent construction
  • Dynamic MCP tool loading and invocation
  • Support for multiple large language models
  • Real-time streaming chat responses
  • Chat history management

Online Demo

Requirements

  • Python 3.12+
  • PostgreSQL
  • uv (Python package & environment manager), installable via pip install uv

Quick Start

  1. Clone the project

```shell
git clone git@github.com:isoftstone-data-intelligence-ai/efflux-backend.git
cd efflux-backend
```
  2. Install uv

```shell
pip install uv
```
  3. Install dependencies

```shell
uv sync --reinstall
```
  4. Activate the virtual environment

```shell
# Activate virtual environment
source .venv/bin/activate   # macOS/Linux

# Deactivate when needed
deactivate
```
  5. Configure environment variables

```shell
# Copy environment variable template
cp .env.sample .env

# Edit the .env file and configure:
# 1. Database connection info (DATABASE_NAME, DATABASE_USERNAME, DATABASE_PASSWORD)
# 2. At least one LLM configuration (e.g., Azure OpenAI, Qwen, Doubao, or Moonshot)
```
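As a sanity check on the database variables above, the snippet below shows how they might be assembled into a connection URL. The variable names match the `.env` template; the exact DSN format Efflux uses is an assumption (an asyncpg-style URL is shown), and the default values are placeholders.

```python
import os

# Placeholder values; in the real setup these come from your .env file.
os.environ.setdefault("DATABASE_NAME", "efflux")
os.environ.setdefault("DATABASE_USERNAME", "postgres")
os.environ.setdefault("DATABASE_PASSWORD", "secret")


def database_url(host: str = "localhost", port: int = 5432) -> str:
    """Builds a PostgreSQL DSN from the variables named in .env.sample."""
    user = os.environ["DATABASE_USERNAME"]
    pwd = os.environ["DATABASE_PASSWORD"]
    name = os.environ["DATABASE_NAME"]
    return f"postgresql+asyncpg://{user}:{pwd}@{host}:{port}/{name}"


print(database_url())
```

If the service later fails to connect, printing the assembled URL (with the password redacted) is a quick way to spot a mismatched variable.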
  6. Select the LLM

```python
# Edit core/common/container.py.
# Find the llm registration and replace it with any of the following
# models (Qwen is the default):
# - QwenLlm: Qwen
# - AzureLlm: Azure OpenAI
# - DoubaoLlm: Doubao
# - MoonshotLlm: Moonshot

# Example: using Azure OpenAI
from core.llm.azure_open_ai import AzureLlm
# ...
llm = providers.Singleton(AzureLlm)
```
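For readers unfamiliar with `providers.Singleton`, here is a minimal pure-Python sketch of the singleton-provider pattern the container relies on. The real code uses the dependency-injector library; the `Singleton` class and the bare `QwenLlm` stub below are illustrative stand-ins, not Efflux's implementation.

```python
from typing import Any, Callable, Optional


class Singleton:
    """Lazily builds one shared instance of the configured class."""

    def __init__(self, cls: Callable[..., Any], *args: Any, **kwargs: Any) -> None:
        self._cls = cls
        self._args = args
        self._kwargs = kwargs
        self._instance: Optional[Any] = None

    def __call__(self) -> Any:
        # First call constructs the object; later calls reuse it.
        if self._instance is None:
            self._instance = self._cls(*self._args, **self._kwargs)
        return self._instance


class QwenLlm:  # stand-in for a client class from core.llm
    pass


# Swapping models, as described above, is a one-line change:
llm = Singleton(QwenLlm)
assert llm() is llm()  # every caller gets the same instance
```

This is why switching providers only touches the registration line: everything else asks the container for `llm` and never names a concrete class.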
  7. Start the PostgreSQL database

```shell
# Method 1: if PostgreSQL is installed locally,
# simply start your local PostgreSQL service.

# Method 2: using Docker (example)
docker run -d --name local-postgres \
    -e POSTGRES_DB=your_database_name \
    -e POSTGRES_USER=your_username \
    -e POSTGRES_PASSWORD=your_password \
    -p 5432:5432 \
    postgres

# Note: make sure the database connection info matches your .env file.
```
  8. Initialize the database

```shell
# Create a new revision and generate a migration file under alembic/versions
alembic revision --autogenerate -m "initial migration"

# Preview the SQL that would be executed
alembic upgrade head --sql

# If the preview looks good, run the migration
alembic upgrade head
```
  9. Initialize LLM template data

```shell
# Run the initialization script
python scripts/init_llm_templates.py
```
  10. Start the service

```shell
python -m uvicorn main:app --host 0.0.0.0 --port 8000
```
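Once the service is up, responses are streamed token by token rather than returned in one block. The snippet below is a self-contained sketch of that pattern with a fake model call; `fake_llm_stream` and `collect` are illustrative names, not Efflux functions, and the real service forwards each chunk to the client (e.g., as a server-sent event) instead of collecting it.

```python
import asyncio
from typing import AsyncIterator


async def fake_llm_stream(prompt: str) -> AsyncIterator[str]:
    """Stand-in for a streaming LLM call: yields tokens one at a time."""
    for token in ["Hello", ", ", "world", "!"]:
        await asyncio.sleep(0)  # yield control, as a network read would
        yield token


async def collect(prompt: str) -> str:
    chunks = []
    async for tok in fake_llm_stream(prompt):
        # The real backend would flush each chunk to the HTTP response
        # here; we just accumulate for demonstration.
        chunks.append(tok)
    return "".join(chunks)


print(asyncio.run(collect("hi")))  # Hello, world!
```

The async-generator shape is what lets the first tokens reach the user while the model is still generating.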

Acknowledgments

This project utilizes the following excellent open-source projects and technologies:

Thanks to the developers and maintainers of these projects for their contributions to the open-source community.
