
feat(wren-ai-service): create Streamlit UI for configuring LLM models #1584


Open
wants to merge 60 commits into base: main
Changes from all commits · 60 commits
da42d8d
fix: add fmt.Scanln for debugging missing error report
yichieh-lu Apr 10, 2025
4584b05
Merge branch 'main' into main
cyyeh Apr 10, 2025
462554d
Merge remote-tracking branch 'upstream/main'
yichieh-lu Apr 10, 2025
96667ab
Merge branch 'main' of https://github.com/yichieh-lu/WrenAI
yichieh-lu Apr 10, 2025
a275e42
Remove redundant code
yichieh-lu Apr 11, 2025
c96f296
Add initial config.grok.yaml to support deployment
yichieh-lu Apr 12, 2025
e7edbd3
Merge remote-tracking branch 'upstream/main'
yichieh-lu Apr 14, 2025
398a14c
Remove redundant engine declaration
yichieh-lu Apr 14, 2025
e090cf0
Remove incorrect settings from config_examples YAML files
yichieh-lu Apr 14, 2025
a39d71f
Merge remote-tracking branch 'upstream/main'
yichieh-lu Apr 16, 2025
d59c6bf
Merge remote-tracking branch 'upstream/main'
yichieh-lu Apr 21, 2025
1168301
Draft streamlit_ui
yichieh-lu Apr 21, 2025
c70f183
Build a testing streamlit_ui
yichieh-lu Apr 23, 2025
a261c3b
Extract constants to constants.py
yichieh-lu Apr 23, 2025
0cf0ddd
refactor: extract download_config and load_block from custom_llm_ui.py
yichieh-lu Apr 23, 2025
35a9743
refactor: split load_blocks into load_yaml_list and group_blocks
yichieh-lu Apr 23, 2025
66488b1
refactor: move session state handling to ConfigState class in session…
yichieh-lu Apr 23, 2025
54885d7
style: remove trailing whitespace
yichieh-lu Apr 23, 2025
c6bc142
refactor: extract UI layout and elements to ui_components.py
yichieh-lu Apr 23, 2025
01d3208
Refactor app structure and fix data type issues
yichieh-lu Apr 24, 2025
04edc56
feat: check for duplicate alias names
yichieh-lu Apr 24, 2025
fdb2114
refactor: extract preview and generate YAML UI to ui_components.py
yichieh-lu Apr 24, 2025
7f8daca
feat: add pipeline configuration in ui_components.py
yichieh-lu Apr 24, 2025
494fac4
feat: add dry_run_test.py testing api_key and embedding model with li…
yichieh-lu Apr 25, 2025
181eb6f
Merge remote-tracking branch 'upstream/main'
yichieh-lu Apr 26, 2025
7436303
feat: add validate function ensuring every field can be validated
yichieh-lu Apr 26, 2025
0f8e386
feat: support multiple API keys configuration for different LLMs
yichieh-lu Apr 27, 2025
cf14ae5
feat: support multiple API keys configuration for different LLMs
yichieh-lu Apr 27, 2025
ac81e37
feat: support saving multiple API keys and enable validation for LLM …
yichieh-lu Apr 29, 2025
9d4f0a0
Merge remote-tracking branch 'upstream/main'
yichieh-lu Apr 29, 2025
90dd5a3
fix: ensure config.yaml is downloaded correctly
yichieh-lu Apr 29, 2025
a608d80
fix: ensure config.yaml is downloaded correctly
yichieh-lu Apr 29, 2025
667043b
feat: add api_base support to render_embedder and session state
yichieh-lu Apr 29, 2025
5a675aa
feat: get the latest WrenAI version in constants.py
yichieh-lu Apr 29, 2025
57077c2
feat: add selectbox for users to choose config.example.yaml by LLM pr…
yichieh-lu Apr 30, 2025
54f133d
chore: improve code comments and docstrings across UI components
yichieh-lu Apr 30, 2025
cad7459
chore: add initial requirements.txt with ui dependencies
yichieh-lu Apr 30, 2025
e9639e4
Merge remote-tracking branch 'upstream/main'
yichieh-lu Apr 30, 2025
05aac3d
chore: updates requirements.txt with ui dependencies
yichieh-lu May 1, 2025
f1b3e45
chore: improve code comments and docstrings across UI components
yichieh-lu May 1, 2025
6533430
feat: add Finished.setting to close Streamlit UI, continue CLI setup,…
yichieh-lu May 2, 2025
366db9f
feat: add Streamlit UI Dockerfile and implement 'custom' launch option
yichieh-lu May 2, 2025
9ce5946
refactor: decouple RunStreamlitUIContainer logic from launch.go and d…
yichieh-lu May 2, 2025
6bed2b4
feat: add 'custom' mode to launch Streamlit UI and ensure config file…
yichieh-lu May 5, 2025
6a068d1
Merge remote-tracking branch 'upstream/main'
yichieh-lu May 5, 2025
d40ae29
chore: rename streamlit-ui to providers-setup and move to tools direc…
yichieh-lu May 6, 2025
011ce4b
feat: rewrite Dockerfile to use Poetry and remove requirements.txt
yichieh-lu May 6, 2025
bb8dab6
chore: rename streamlit-ui to providers-setup
yichieh-lu May 6, 2025
b4d5b4e
update dependencies
cyyeh May 6, 2025
6255290
Merge remote-tracking branch 'upstream/main'
yichieh-lu May 6, 2025
d0d0360
Merge branches 'main' and 'main' of https://github.com/yichieh-lu/WrenAI
yichieh-lu May 6, 2025
8341378
chore: fix Streamlit UI layout
yichieh-lu May 6, 2025
eeedb81
Merge remote-tracking branch 'upstream/main'
yichieh-lu May 7, 2025
d9bd650
Merge branch 'main' into main
cyyeh May 9, 2025
9f830b1
Merge branch 'main' into main
cyyeh May 9, 2025
430c67b
Merge remote-tracking branch 'upstream/main'
yichieh-lu May 13, 2025
fc5ee05
refactor(wren-ai-service): streamline UI components and enhance confi…
yichieh-lu May 13, 2025
efc6529
refactor(wren-ai-service): improve configuration path handling and UI…
yichieh-lu May 13, 2025
095475c
feat(wren-ai-service): add validation for configuration blocks and im…
yichieh-lu May 13, 2025
8f892e6
refactor(wren-ai-service): enhance config extraction and improve UI c…
yichieh-lu May 13, 2025
58 changes: 58 additions & 0 deletions wren-ai-service/tools/providers-setup/Dockerfile
@@ -0,0 +1,58 @@
# Use official slim Python 3.12 base image
FROM python:3.12.0-slim

# -------------------------------
# System Dependencies for Poetry
# -------------------------------
# Install minimal system packages: curl (for downloading), build tools (for native extensions)
RUN apt-get update && apt-get install -y --no-install-recommends \
curl build-essential gcc \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/*

# -------------------------------
# Install Poetry (Python package manager)
# -------------------------------
ENV POETRY_VERSION=1.8.2
ENV POETRY_HOME="/opt/poetry"
ENV PATH="$POETRY_HOME/bin:$PATH"
ENV POETRY_VIRTUALENVS_CREATE=false
# Don't use virtualenvs inside the container

RUN curl -sSL https://install.python-poetry.org | python3 - \
&& ln -s $POETRY_HOME/bin/poetry /usr/local/bin/poetry

# -------------------------------
# Set working directory for app
# -------------------------------
WORKDIR /app

# -------------------------------
# Install Python dependencies via Poetry
# -------------------------------
# Copy only dependency files first to leverage Docker layer caching
COPY pyproject.toml poetry.lock ./
RUN poetry install --no-interaction --no-ansi

# -------------------------------
# Copy remaining app code
# -------------------------------
COPY . .

# -------------------------------
# Environment variables for Streamlit
# -------------------------------
ENV PYTHONUNBUFFERED=1
ENV STREAMLIT_SERVER_HEADLESS=true
ENV STREAMLIT_SERVER_PORT=8501
ENV STREAMLIT_SERVER_ENABLECORS=false

# -------------------------------
# Expose Streamlit port
# -------------------------------
EXPOSE 8501

# -------------------------------
# Default command to run the Streamlit app
# -------------------------------
CMD ["streamlit", "run", "app.py"]
Comment on lines +1 to +58

🛠️ Refactor suggestion

Run container as non-root user for security.

The Dockerfile doesn't specify a user, which means the container will run as root by default. This is a security risk.

Add a non-root user and switch to it before running the application:

 COPY . .

+# -------------------------------
+# Create and switch to non-root user
+# -------------------------------
+RUN useradd -m streamlituser
+USER streamlituser
+
 # -------------------------------
 # Environment variables for Streamlit
 # -------------------------------
🤖 Prompt for AI Agents (early access)
In wren-ai-service/tools/providers-setup/Dockerfile lines 1 to 58, the container
runs as root by default, which is a security risk. To fix this, add commands to
create a non-root user and group, set appropriate ownership of the /app
directory, and switch to this user before the CMD instruction. This ensures the
application runs with limited privileges inside the container.

89 changes: 89 additions & 0 deletions wren-ai-service/tools/providers-setup/app.py
@@ -0,0 +1,89 @@
from config_loader import load_config_yaml_blocks, group_blocks
from session_state import ConfigState
from ui_components import (
render_llm_config,
render_embedder_config,
render_import_yaml,
render_pipeline_config,
render_preview,
render_apikey,
render_generate_button
)
import streamlit as st

# Set Streamlit page layout
st.set_page_config(
layout="wide", # Use a wide layout for better horizontal space
initial_sidebar_state="expanded" # Expand sidebar by default
)

# Load and group configuration blocks from YAML
yaml_list = load_config_yaml_blocks()
blocks = group_blocks(yaml_list)

# Retrieve individual configuration sections
llm_block = blocks.get("llm", {})
embedder_block = blocks.get("embedder", {})
document_store_block = blocks.get("document_store", {})
engine_blocks = blocks.get("engine", [])
pipeline_block = blocks.get("pipeline", {})
settings_block = blocks.get("settings", {})


# Validate required blocks (type + content)
missing_blocks = []

if not isinstance(llm_block, dict) or not llm_block:
missing_blocks.append("LLM")
if not isinstance(embedder_block, dict) or not embedder_block:
missing_blocks.append("Embedder")
if not isinstance(document_store_block, dict) or not document_store_block:
missing_blocks.append("Document Store")
if not isinstance(pipeline_block, dict) or not pipeline_block:
missing_blocks.append("Pipeline")

if missing_blocks:
st.warning(
f"⚠️ Missing or empty configuration blocks: {', '.join(missing_blocks)}. "
"Default values will be used where applicable."
)

# Initialize session state with default or imported config values
ConfigState.init(llm_block, embedder_block, document_store_block, pipeline_block)

# ----------------------
# Streamlit UI rendering
# ----------------------
st.title("Custom Provider Config Generator")

# Layout: two columns – left for inputs, right for preview/export
col1, col2 = st.columns([1.5, 1])

with col1:

# API key input section
st.subheader("API_KEY Configuration")
render_apikey()

# Upload and parse YAML file into session state
st.subheader("LLM Configuration")
render_import_yaml()

# LLM model configuration UI
render_llm_config()

# Embedding model configuration UI
st.subheader("Embedder Configuration")
render_embedder_config()

# Pipeline flow configuration UI
st.subheader("Pipeline Configuration")
render_pipeline_config()

# Generate config.yaml and save configuration button
render_generate_button(engine_blocks, settings_block)

with col2:
# Final preview and export of the combined configuration as YAML
render_preview(engine_blocks, settings_block)

127 changes: 127 additions & 0 deletions wren-ai-service/tools/providers-setup/config_loader.py
@@ -0,0 +1,127 @@
import requests
import yaml
from session_state import ConfigState
from pathlib import Path
import constants as cst
from typing import Any, Dict, List
import streamlit as st

def load_config_yaml_blocks() -> List[Dict[str, Any]]:
"""
Load the config.yaml from local disk if available;
otherwise, fetch it from the GitHub URL (the remote copy is not written to disk).
"""
CONFIG_IN_PATH = cst.get_config_path()
if CONFIG_IN_PATH.exists():
try:
return load_yaml_list(CONFIG_IN_PATH)
except Exception as e:
st.error(f"❌ Failed to parse local config.yaml: {e}")
return []
else:
return fetch_yaml_from_url(cst.CONFIG_URL)

def load_selected_example_yaml(selected_example: str) -> List[Dict[str, Any]]:
"""
Fetch a selected YAML example file from GitHub and return it as a list of blocks.
"""
selected_url = cst.CONFIG_EXAMPLES_SELECTED_URL + selected_example
try:
response = requests.get(selected_url, timeout=cst.REQUEST_TIMEOUT)
response.raise_for_status()
return list(yaml.safe_load_all(response.text))
except requests.RequestException as e:
st.error(f"❌ Error loading config from GitHub: {e}")
return []

def fetch_yaml_from_url(url: str) -> List[Dict[str, Any]]:
"""
Fetch and parse a YAML list from a remote URL.
Returns an empty list if fetch or parsing fails.
"""
try:
response = requests.get(url, timeout=cst.REQUEST_TIMEOUT)
response.raise_for_status()
config_list = list(yaml.safe_load_all(response.text))

if not config_list:
raise ValueError(f"⚠️ Received empty YAML content from: {url}")

return config_list

except (requests.RequestException, ValueError, yaml.YAMLError) as e:
st.error(f"❌ Error loading config from {url}: {e}")
return []

def extract_config_blocks(config_list: List[Dict[str, Any]]) -> Dict[str, Any]:
"""
Extract the first block of each type from the config list.
"""
grouped = group_blocks(config_list)

def get_first_or_empty(key: str) -> Dict[str, Any]:
val = grouped.get(key, {})
if isinstance(val, list):
return val[0] if val else {}
return val or {}

return {
"llm": get_first_or_empty("llm"),
"embedder": get_first_or_empty("embedder"),
"document_store": get_first_or_empty("document_store"),
"pipeline": get_first_or_empty("pipeline"),
}

def load_yaml_list(path: Path) -> List[Dict[str, Any]]:
"""
Load and parse all YAML documents from a file path.
"""
with path.open("r", encoding="utf-8") as f:
return list(yaml.safe_load_all(f))

def group_blocks(blocks: List[Dict[str, Any]]) -> Dict[str, Any]:
"""
Group YAML blocks by their 'type' field.
If multiple blocks share the same type, they are stored as a list.
"""
save_blocks = {}
for block in blocks:
key = block.get("type") or ("settings" if "settings" in block else None)
if not key:
continue
if key in save_blocks:
if isinstance(save_blocks[key], list):
save_blocks[key].append(block)
else:
save_blocks[key] = [save_blocks[key], block]
else:
save_blocks[key] = block
return save_blocks
Comment on lines +82 to +99

🛠️ Refactor suggestion

Mixed value types from group_blocks complicate downstream handling

When the second block of a given type is encountered, the first block is wrapped in a list and the new one appended; any further blocks of that type are appended to that list. A type that only ever appears once, however, keeps a plain dict as its value:

if key in save_blocks:
    if isinstance(save_blocks[key], list):
        save_blocks[key].append(block)                 # 3rd+ block: append to existing list
    else:
        save_blocks[key] = [save_blocks[key], block]   # 2nd block: promote dict to list
else:
    save_blocks[key] = block                           # 1st block: plain dict

Because a value can be either a dict or a list, every consumer (see get_first_or_empty in extract_config_blocks) has to handle both shapes.
Initialising the grouping with defaultdict(list) and always appending keeps the value type uniform and removes the promotion logic (a sketch follows the AI-agent prompt below).

🤖 Prompt for AI Agents (early access)
In wren-ai-service/tools/providers-setup/config_loader.py around lines 74 to 91,
group_blocks stores a plain dict for a type that appears once and a list for a
type that appears more than once, so every consumer must handle both shapes.
Keep the value type uniform, for example by building the grouping with
defaultdict(list) and always appending, and update callers such as
extract_config_blocks accordingly.
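
A minimal sketch of the defaultdict(list) variant (not part of this PR): every value becomes a list, so a caller such as extract_config_blocks could drop its isinstance check and simply take the first element.

from collections import defaultdict
from typing import Any, Dict, List

def group_blocks(blocks: List[Dict[str, Any]]) -> Dict[str, List[Dict[str, Any]]]:
    """Group YAML blocks by their 'type' field; every value is a list."""
    grouped: Dict[str, List[Dict[str, Any]]] = defaultdict(list)
    for block in blocks:
        key = block.get("type") or ("settings" if "settings" in block else None)
        if not key:
            continue  # skip blocks with no recognisable type
        grouped[key].append(block)
    return dict(grouped)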


def fetch_example_yaml_filenames() -> List[str]:
"""
Fetch the filenames of all .yaml example configs from the GitHub directory
(does not download the content).
"""
try:
response = requests.get(cst.CONFIG_EXAMPLES_URL, timeout=cst.REQUEST_TIMEOUT)
response.raise_for_status()
file_list = response.json()
return [f["name"] for f in file_list if f["name"].endswith(".yaml")]
except requests.RequestException as e:
st.error(f"Error fetching config example filenames: {e}")
return []

def apply_config_blocks(config_blocks: List[Dict[str, Any]]):
"""
Group and apply config blocks by updating the Streamlit session state via ConfigState.
"""
grouped = extract_config_blocks(config_blocks)

ConfigState.init(
grouped["llm"],
grouped["embedder"],
grouped["document_store"],
grouped["pipeline"],
force=True
)
87 changes: 87 additions & 0 deletions wren-ai-service/tools/providers-setup/constants.py
@@ -0,0 +1,87 @@
from pathlib import Path
import os
import requests

# -------------------------------
# Fetch Latest Release Version
# -------------------------------

def get_latest_config_version():
"""
Retrieve the latest release tag from the WrenAI GitHub repository.

Returns:
str: The latest version tag (e.g., "0.20.2") if successful,
or "main" as a fallback if the request fails.
"""
url = "https://api.github.com/repos/Canner/WrenAI/releases/latest"
try:
response = requests.get(url, timeout=10)
if response.status_code == 200:
data = response.json()
return data["tag_name"]
else:
print(f"Failed to get latest release: {response.status_code}")
except Exception as e:
print(f"Error fetching latest config version: {e}")

return "main" # Fallback to 'main' branch if the request fails
Comment on lines +17 to +28

🛠️ Refactor suggestion

Unauthenticated GitHub API call risks 403 rate-limit

/releases/latest without a token is limited to 60 requests per hour per IP.
Inside Docker-based CI, or with many users behind the same NAT, this can quickly return 403, causing the app to fall back to main and possibly fetch incompatible config schemas.

Consider:

  1. Reading a GITHUB_TOKEN from the environment and adding an Authorization header when it is present.
  2. Caching the result in a file or variable to avoid repeated calls during a single run (a sketch combining both ideas follows after this comment).
🤖 Prompt for AI Agents (early access)
In wren-ai-service/tools/providers-setup/constants.py around lines 17 to 28, the
GitHub API call to fetch the latest release is unauthenticated, risking 403 rate
limits. Fix this by reading a GITHUB_TOKEN from the environment and, if present,
include it as an Authorization header in the requests.get call. Additionally,
implement caching of the fetched tag_name in a file or environment variable to
avoid repeated API calls during a single run.
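
A minimal sketch combining both ideas, assuming the surrounding constants.py layout; the module-level cache variable is illustrative and not part of the PR:

import os
import requests

_cached_version = None  # reused across calls within a single run

def get_latest_config_version() -> str:
    """Return the latest WrenAI release tag, falling back to 'main' on any failure."""
    global _cached_version
    if _cached_version:
        return _cached_version

    headers = {}
    token = os.getenv("GITHUB_TOKEN")
    if token:  # authenticated requests get a much higher rate limit
        headers["Authorization"] = f"Bearer {token}"

    url = "https://api.github.com/repos/Canner/WrenAI/releases/latest"
    try:
        response = requests.get(url, headers=headers, timeout=10)
        response.raise_for_status()
        _cached_version = response.json()["tag_name"]
        return _cached_version
    except requests.RequestException as e:
        print(f"Error fetching latest config version: {e}")
        return "main"  # fallback branch, matching the current behaviour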



# -------------------------------
# Constants for Config Loading
# -------------------------------

CONFIG_VERSION = get_latest_config_version()

# URL for the default config YAML (used if no local config is found)
CONFIG_URL = f"https://raw.githubusercontent.com/Canner/WrenAI/{CONFIG_VERSION}/docker/config.example.yaml"

# GitHub API URL to list config examples (only metadata)
CONFIG_EXAMPLES_URL = (
f"https://api.github.com/repos/Canner/WrenAI/contents/wren-ai-service/docs/config_examples?ref={CONFIG_VERSION}"
)

# Base URL to fetch individual example YAML files by filename
CONFIG_EXAMPLES_SELECTED_URL = (
f"https://raw.githubusercontent.com/Canner/WrenAI/{CONFIG_VERSION}/wren-ai-service/docs/config_examples/"
)

# -------------------------------
# Local Config Paths
# -------------------------------

volume_app_data = Path("/app/data")

# Global HTTP request timeout in seconds
REQUEST_TIMEOUT = 10

def get_config_done_path():
# Docker environment: mounted config.done
docker_path = volume_app_data / "config.done"
local_path = Path.home() / ".wrenai" / "config.done"

if docker_path.exists():
return docker_path
else:
return local_path

def get_config_path():
# Docker environment: mounted config.yaml
docker_path = volume_app_data / "config.yaml"
local_path = Path.home() / ".wrenai" / "config.yaml"

if docker_path.exists():
return docker_path
else:
return local_path

# Path to the .env file
def get_env_path():
docker_path = volume_app_data / ".env"
local_path = Path.home() / ".wrenAI" / ".env"

Comment on lines +82 to +83

🛠️ Refactor suggestion

Inconsistent home directory naming can cause conflicts

The code uses two different directory names (.wrenai for config files vs .wrenAI for .env) which can cause issues on case-sensitive file systems and lead to confusion.

Standardize on a single directory name:

def get_env_path():
    docker_path = volume_app_data / ".env"
-   local_path = Path.home() / ".wrenAI" / ".env"
+   local_path = Path.home() / ".wrenai" / ".env"

    if docker_path.exists():
🤖 Prompt for AI Agents (early access)
In wren-ai-service/tools/providers-setup/constants.py at lines 82 to 83, the
directory name used for the home path is ".wrenAI" which is inconsistent with
the ".wrenai" used elsewhere for config files. To fix this, standardize the
directory name by changing ".wrenAI" to ".wrenai" so that the home directory
path is consistent across the codebase and avoids conflicts on case-sensitive
file systems.
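
Beyond the one-character fix, the three path helpers in this file differ only in the filename they resolve; a small shared helper (a sketch only, the _resolve_path name is illustrative and not part of the PR) would make the local directory name consistent by construction:

from pathlib import Path

volume_app_data = Path("/app/data")
LOCAL_CONFIG_DIR = Path.home() / ".wrenai"  # single, consistent local directory

def _resolve_path(filename: str) -> Path:
    """Prefer the Docker-mounted file; otherwise fall back to the local config dir."""
    docker_path = volume_app_data / filename
    return docker_path if docker_path.exists() else LOCAL_CONFIG_DIR / filename

def get_config_done_path() -> Path:
    return _resolve_path("config.done")

def get_config_path() -> Path:
    return _resolve_path("config.yaml")

def get_env_path() -> Path:
    return _resolve_path(".env")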

if docker_path.exists():
return docker_path
else:
return local_path