A modern, locally-hosted AI chat application built with Quarkus, Ollama, and LangChain4j. This application provides a web-based interface for interacting with local Large Language Models through Ollama.
- 🤖 Real-time chat with AI using local language models
- 🔒 Privacy-focused (all processing happens on your machine)
- 💬 Modern, responsive UI with typing animations
- 🔄 WebSocket support for instant communication
- 🌐 RESTful API endpoints as fallback
- 💾 Chat session management
- 🎨 Sleek design with Tailwind CSS
- Java 21
- Maven 3.8+
- Ollama - Local LLM serving platform
- An LLM model downloaded through Ollama (e.g., llama2)
First, install Ollama by following the official instructions at ollama.ai.
After installing Ollama, pull a language model (this example uses llama2):
```bash
ollama pull llama2
```

Ensure Ollama is running and start the model service:
```bash
ollama run llama2
```

You can keep this terminal open while running the application. The model will be available for API calls on port 11434.
```bash
./mvnw quarkus:dev
```

This will start the application in development mode. The chat interface will be available at http://localhost:8080.
The application can be configured in src/main/resources/application.properties:
```properties
# Application configuration
quarkus.application.name=AI Chat Assistant
quarkus.http.port=8080

# Ollama configuration
# URL where Ollama is running
ollama.base.url=http://localhost:11434
# Default model to use
ollama.model=llama2

# Logging configuration
quarkus.log.console.enable=true
quarkus.log.console.format=%d{HH:mm:ss} %-5p [%c{2.}] (%t) %s%e%n
quarkus.log.level=INFO
```

- Open your browser and navigate to http://localhost:8080
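To switch to a different model, change `ollama.model` to any model you have already pulled (assuming the application reads the key as shown above). For example, `mistral` here is just an illustrative model name:

```properties
# Point the application at another locally pulled model
# (run `ollama pull mistral` first)
ollama.model=mistral
```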
- The chat interface will automatically create a new session
- Type your message in the input field and press Enter or click the send button
- The AI will respond in real-time with a typing animation
You can use the "New Chat" button in the top-right corner to clear the current conversation and start a fresh chat session.
If you want to integrate with the chat assistant programmatically:
Create a new chat session:

```
POST /api/chat
```

Returns: `{"sessionId":"uuid-string"}`

Send a message to an existing session:

```
POST /api/chat/{sessionId}/message
```

Request body:

```json
{
  "content": "Your message here",
  "sender": "user"
}
```

Delete a session:

```
DELETE /api/chat/{sessionId}
```
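As an illustration, the message endpoint can be called from Java with the JDK's built-in HTTP client. This is a minimal sketch: the `sessionId` value is a placeholder for the id returned by `POST /api/chat`, and the request is only built and printed here, not sent.

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class ChatApiExample {
    public static void main(String[] args) {
        String sessionId = "demo-session"; // placeholder: use the id from POST /api/chat
        String body = "{\"content\": \"Your message here\", \"sender\": \"user\"}";

        // Build a POST request against the message endpoint
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/api/chat/" + sessionId + "/message"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        // Once the app is running, dispatch it with
        // HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString())
        System.out.println(request.method() + " " + request.uri());
    }
}
```

Running this prints the method and target URI, confirming the request is shaped correctly before the application is up.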
For real-time communication:
```javascript
const socket = new WebSocket(`ws://localhost:8080/chat-socket/${sessionId}`);

// Send a message
socket.send(JSON.stringify({
  content: "Your message here",
  sender: "user"
}));

// Listen for responses
socket.onmessage = (event) => {
  const message = JSON.parse(event.data);
  console.log(message.content);
};