A template implementation of a conversational agent using LangGraph and GPT-4. It demonstrates how to use LangGraph to build interactive AI agents with tool integration.
- Interactive conversational interface
- Tool integration support (including weather and search capabilities; see the sketch after this list)
- Streaming responses for real-time interaction
- Built on LangGraph for efficient agent orchestration
- Easy deployment and integration with Blaxel platform
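The following is a minimal sketch of the kind of tool-calling, streaming agent this template is built around. It assumes an OpenAI API key is available in the environment; the `get_weather` tool, its canned reply, and the prompt are illustrative placeholders rather than the template's actual code.

```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent


@tool
def get_weather(city: str) -> str:
    """Return a short weather report for a city (stubbed for illustration)."""
    return f"It is sunny and 21°C in {city}."


# Bind the tool to a GPT-4 model and let LangGraph orchestrate tool calls.
model = ChatOpenAI(model="gpt-4")
agent = create_react_agent(model, tools=[get_weather])

# Stream intermediate states so responses can be rendered in real time.
for state in agent.stream(
    {"messages": [("user", "What is the weather in Paris?")]},
    stream_mode="values",
):
    state["messages"][-1].pretty_print()
```

In the template itself, the model and tools are wired up in src/agent.py (see the project structure below).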
- Python: 3.10 or later
- UV: An extremely fast Python package and project manager, written in Rust
- Blaxel CLI: Ensure you have the Blaxel CLI installed. If not, install it globally:
  ```bash
  curl -fsSL https://raw.githubusercontent.com/beamlit/toolkit/main/install.sh | BINDIR=$HOME/.local/bin sh
  ```
- Blaxel login: Log in to the Blaxel platform:
  ```bash
  bl login YOUR-WORKSPACE
  ```
Clone the repository and install dependencies:
```bash
git clone https://github.com/beamlit/template-langgraph-py.git
cd template-langgraph-py
uv sync
```
Start the development server with hot reloading:
```bash
bl serve --hotreload
```
Note: This command starts the server and enables hot reload so that changes to the source code are automatically reflected.
You can test your agent using the chat interface:
```bash
bl chat --local blaxel-agent
```
Or run it directly with specific input:
```bash
bl run agent blaxel-agent --local --data '{"input": "What is the weather in Paris?"}'
```
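You can also call the locally running agent from Python instead of the CLI. The sketch below is assumption-heavy: the base URL is a placeholder (use the address that `bl serve` prints on startup), while the JSON payload mirrors the `bl run` example above.

```python
import requests

# Assumed local address: replace with the host/port reported by `bl serve`.
BASE_URL = "http://localhost:1338"

# Same JSON payload as the `bl run` example above.
payload = {"input": "What is the weather in Paris?"}

response = requests.post(BASE_URL, json=payload, timeout=60)
response.raise_for_status()
print(response.text)
```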
When you are ready to deploy your application:
```bash
bl deploy
```
This command uses your code and the configuration files under the .blaxel
directory to deploy your application.
- src/main.py - Application entry point
- src/agent.py - Core agent implementation with LangGraph integration
- src/server/ - Server implementation and routing (a hypothetical sketch follows this list)
- pyproject.toml - UV package manager configuration
- blaxel.toml - Blaxel deployment configuration
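As a hypothetical illustration of how src/server/ might expose the agent over HTTP (the template's actual routing code may differ), assuming a FastAPI app and the compiled LangGraph `agent` from src/agent.py:

```python
from fastapi import FastAPI
from pydantic import BaseModel

from .agent import agent  # hypothetical import of the compiled LangGraph graph

app = FastAPI()


class AgentRequest(BaseModel):
    input: str


@app.post("/")
async def run_agent(request: AgentRequest) -> str:
    # Run the graph once and return the final assistant message as plain text.
    result = await agent.ainvoke({"messages": [("user", request.input)]})
    return result["messages"][-1].content
```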
This project is licensed under the MIT License. See the LICENSE file for more details.