# Veridia

AI Code Assistant powered by your own LLM models
Veridia is a modern, self-hosted AI client that connects to your local LLM models via LM Studio. It provides intelligent code analysis, generation, and development assistance while keeping your data completely private.
## Features

- 🔒 100% Private: Your code never leaves your machine
- 🚀 Modern UI: Clean React interface with Material-UI
- ⚡ Fast API: Python FastAPI backend for optimal performance
- 🎯 Code-Focused: Specialized for development workflows
- 🔧 Easy Setup: Simple configuration with LM Studio
- 📱 Responsive: Works on desktop and mobile browsers
## Prerequisites

- Python 3.8+
- Node.js 16+
- LM Studio (Download here)
### Set up LM Studio

- Download and install LM Studio
- Download a recommended model (see Recommended Models)
- Start LM Studio and load your model
- Enable the API server (default: `http://localhost:1234`)
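Before wiring the backend to the server, it helps to confirm the endpoint answers at all. Here is a minimal sketch assuming the default address above; the `server_is_up` helper is illustrative and not part of Veridia:

```python
import urllib.request
import urllib.error

def server_is_up(base_url: str = "http://localhost:1234", timeout: float = 2.0) -> bool:
    """Return True if an HTTP server answers at base_url."""
    try:
        urllib.request.urlopen(f"{base_url}/v1/models", timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server responded, just with an error status -- still reachable.
        return True
    except (urllib.error.URLError, OSError):
        return False
```

Calling `server_is_up()` from a Python shell and getting `False` usually means the API server toggle in LM Studio is off, or the server is running on a different port.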
## Installation

```bash
# Clone the repository
git clone https://github.com/nhexen/veridia.git
cd veridia

# Install backend dependencies
cd backend
pip install -r requirements.txt

# Install frontend dependencies
cd ../frontend
npm install
```

## Running Veridia

```bash
# Start the backend (from project root)
cd backend
uvicorn main:app --reload

# Start the frontend (in another terminal)
cd frontend
npm start
```

🎉 Veridia is now running!
- Frontend: http://localhost:3000
- Backend API: http://localhost:8000
## Recommended Models

### Coding Models

| Model | Size | Best For | Download |
|---|---|---|---|
| DeepSeek Coder V2 | 16B | General coding, multiple languages | Hugging Face |
| Code Llama | 7B-34B | Python, JavaScript, debugging | Meta AI |
| Phind CodeLlama | 34B | Complex problem solving | Hugging Face |
| WizardCoder | 15B | Code explanation, refactoring | Hugging Face |
### General Models

| Model | Size | Best For | Download |
|---|---|---|---|
| Mixtral 8x7B | 8x7B | Balanced performance | Hugging Face |
| Llama 2 | 7B-70B | General purpose | Meta AI |
## Configuration

### LM Studio

- Open LM Studio
- Go to the Developer tab
- Start the server on `localhost:1234`
- Note your model name
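If you prefer not to copy the model name by hand, it can be fetched programmatically. This sketch assumes the OpenAI-compatible `GET /v1/models` endpoint that LM Studio exposes; the helper names are illustrative:

```python
import json
import urllib.request
from typing import List

def extract_model_ids(models_response: dict) -> List[str]:
    """Pull model identifiers out of an OpenAI-style /v1/models response."""
    return [entry["id"] for entry in models_response.get("data", [])]

def list_models(base_url: str = "http://localhost:1234") -> List[str]:
    """Query the local LM Studio server for the models it has loaded."""
    with urllib.request.urlopen(f"{base_url}/v1/models", timeout=5) as response:
        return extract_model_ids(json.load(response))
```

Whatever identifier `list_models()` returns is the string to use in the backend configuration below.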
### Backend

Update `backend/main.py` with your model details:

```python
LM_STUDIO_API_URL = "http://localhost:1234/v1/chat/completions"

# Update the model name in the generate_code function
"model": "your-model-name-here"
```

## Project Structure

```
veridia/
├── backend/                 # FastAPI backend
│   ├── main.py              # API endpoints
│   └── requirements.txt     # Python dependencies
├── frontend/                # React frontend
│   ├── src/
│   │   ├── App.js           # Main component
│   │   └── index.js         # Entry point
│   ├── public/              # Static files
│   └── package.json         # Node dependencies
├── .github/                 # GitHub configuration
└── README.md                # This file
```
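With `LM_STUDIO_API_URL` and the model name configured, the request from `backend/main.py` to LM Studio can be sketched as follows — assuming LM Studio's OpenAI-compatible chat completions API; `build_payload` and the exact body of `generate_code` are illustrative, not Veridia's actual implementation:

```python
import json
import urllib.request

LM_STUDIO_API_URL = "http://localhost:1234/v1/chat/completions"
MODEL_NAME = "your-model-name-here"  # replace with the name shown in LM Studio

def build_payload(prompt: str, model: str = MODEL_NAME) -> dict:
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful coding assistant."},
            {"role": "user", "content": prompt},
        ],
        "temperature": 0.2,
    }

def generate_code(prompt: str) -> str:
    """POST the prompt to LM Studio and return the assistant's reply text."""
    request = urllib.request.Request(
        LM_STUDIO_API_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        body = json.load(response)
    return body["choices"][0]["message"]["content"]
```

Separating payload construction from the HTTP call keeps the model name in one place and makes the request shape easy to unit-test without a running server.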
## Development

### Backend

```bash
cd backend

# Install in development mode
pip install -e .

# Run with auto-reload
uvicorn main:app --reload --host 0.0.0.0 --port 8000
```

### Frontend

```bash
cd frontend

# Start development server
npm start

# Build for production
npm run build
```

## Contributing

- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Acknowledgments

- LM Studio - For the excellent local LLM runtime
- FastAPI - For the modern Python web framework
- React - For the powerful frontend library
- FontAwesome - For the beautiful icon library
Made for developers who value privacy and control over their AI tools.