Build AI-powered applications with the Mastra framework. Everything you need to quickly develop and deploy AI agents and workflows.
Mastra AI is a powerful framework for building AI applications with agents, tools, and workflows. This starter kit demonstrates how to:
- Create AI agents using LLMs
- Build custom tools for your agents
- Define multi-step workflows
- Set up a development environment
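For example, an agent definition in `src/mastra/agents/` looks roughly like the sketch below. The names, instructions, and model choice are illustrative, and it assumes the `@mastra/core` and `@ai-sdk/openai` packages that Mastra projects typically use:

```ts
import { Agent } from '@mastra/core/agent';
import { openai } from '@ai-sdk/openai';

// Illustrative agent; the starter's actual weather agent may differ in
// name, instructions, model, and attached tools.
export const weatherAgent = new Agent({
  name: 'Weather Agent',
  instructions: 'You answer questions about the current weather in a given city.',
  model: openai('gpt-4o-mini'),
});
```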
```
mastra-starter/
├── src/
│   └── mastra/
│       ├── agents/       # AI agent definitions
│       ├── tools/        # Custom tools for agents
│       ├── workflows/    # Multi-step workflows
│       └── index.ts      # Main Mastra configuration
├── package.json          # Project dependencies
├── tsconfig.json         # TypeScript configuration
└── .env.example          # Environment variables template
```
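The `index.ts` entry above is where agents and workflows are registered on a single `Mastra` instance. A minimal sketch, assuming agent and workflow names along the lines of the weather example:

```ts
import { Mastra } from '@mastra/core';

import { weatherAgent } from './agents';
import { weatherWorkflow } from './workflows';

// Illustrative configuration; the starter's real index.ts may also set up
// logging, storage, or additional agents and workflows.
export const mastra = new Mastra({
  agents: { weatherAgent },
  workflows: { weatherWorkflow },
});
```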
- Pre-configured Agents: Agents powered by OpenAI models
- Custom Tools: Ready-to-use tools for common tasks like weather data retrieval (see the sketch below)
- Workflow System: Define complex multi-step processes with error handling
- TypeScript Support: Full type safety with TypeScript
- Development Server: Built-in development server with hot reloading
- Dashboard: Interactive interface for testing and debugging
- Quick Start: Get up and running with minimal setup.
- Best Practices: Learn from pre-built examples that follow recommended patterns.
- Extensible: Easily add new agents, tools, and workflows.
- Modern Tech Stack: Built with TypeScript, OpenAI, and modern JavaScript.
- Production Ready: Structured for both development and production deployment.
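The weather tool mentioned under Custom Tools above gives a feel for how tools are written. A minimal sketch, assuming Mastra's `createTool` helper and `zod` for schemas; the real tool presumably calls an actual weather API:

```ts
import { createTool } from '@mastra/core/tools';
import { z } from 'zod';

// Illustrative tool; swap the placeholder return value for a real API call.
export const weatherTool = createTool({
  id: 'get-weather',
  description: 'Get the current weather for a city',
  inputSchema: z.object({
    city: z.string().describe('Name of the city'),
  }),
  outputSchema: z.object({
    temperature: z.number(),
    conditions: z.string(),
  }),
  execute: async ({ context }) => {
    // `context` holds the validated input ({ city }).
    return { temperature: 20, conditions: `Placeholder conditions for ${context.city}` };
  },
});
```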
- Node.js 22+ (LTS recommended)
- npm or yarn
- OpenAI API key (for the example agent)
- Clone the repository:

  ```bash
  git clone https://github.com/BunsDev/mastra-starter.git
  cd mastra-starter
  ```

- Set up environment variables:

  ```bash
  cp .env.example .env
  ```

  Then edit `.env` and add your OpenAI API key:

  ```bash
  OPENAI_API_KEY=your_api_key_here
  ```

- Install dependencies and start the development server:

  ```bash
  yarn && yarn dev
  ```
Running `yarn dev` or `npm run dev` starts the Mastra development server, which:
- Watches for changes in your source files
- Automatically restarts the server when changes are detected
- Provides a dashboard interface to interact with your agents and workflows
- Logs detailed information about agent interactions and workflow executions
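Under the hood, the `dev` script in a Mastra starter typically just invokes the Mastra CLI, so the same server can be started directly (assuming `mastra` is installed as a project dependency):

```bash
npx mastra dev
```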
Running the development server opens a local development dashboard at http://localhost:4111. This dashboard provides:
- A chat interface to interact with your agents
- Workflow execution visualization
- Debug information and logs
- Agent and tool performance metrics
- Direct access to workflow execution details
The development server also exposes an API at `/api`, with endpoints for:
- Starting workflow executions
- Sending messages to agents
- Checking workflow status
- Retrieving execution results
```js
// Start a weather workflow
fetch('http://localhost:4111/api/workflows/weather-workflow/execute', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    city: 'New York',
  }),
})
  .then(response => response.json())
  .then(data => console.log(data));
```
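Agents can be called over the same API. The sketch below assumes an `/api/agents/<agentId>/generate` route and a `weatherAgent` id; check the dashboard or the Mastra docs for the exact routes exposed by your version:

```js
// Send a message to an agent (route and agent id are illustrative)
fetch('http://localhost:4111/api/agents/weatherAgent/generate', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    messages: [{ role: 'user', content: 'What is the weather in New York?' }],
  }),
})
  .then(response => response.json())
  .then(data => console.log(data));
```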
- Check out the Mastra Documentation for comprehensive guides.
- Explore the Weather agent and workflow as examples.
- Experiment with different OpenAI models.
- Create your own custom tools and agents (see the workflow sketch below for a starting point).
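A rough sketch of what a new workflow might look like; the workflow API has changed across Mastra releases, so this assumes the `createWorkflow`/`createStep` helpers from `@mastra/core/workflows` and should be checked against the version in `package.json`:

```ts
import { createWorkflow, createStep } from '@mastra/core/workflows';
import { z } from 'zod';

// Illustrative single-step workflow; real workflows usually chain several
// steps and call registered tools or agents.
const fetchForecast = createStep({
  id: 'fetch-forecast',
  inputSchema: z.object({ city: z.string() }),
  outputSchema: z.object({ summary: z.string() }),
  execute: async ({ inputData }) => {
    // Replace with a real lookup, e.g. via the weather tool.
    return { summary: `Forecast for ${inputData.city} goes here` };
  },
});

export const forecastWorkflow = createWorkflow({
  id: 'forecast-workflow',
  inputSchema: z.object({ city: z.string() }),
  outputSchema: z.object({ summary: z.string() }),
})
  .then(fetchForecast)
  .commit();
```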
- Community for Blockchain and AI enthusiasts
- Learning Hub with tutorials, guides, and coding resources
- Build in Public with our developer community
- Join Us and be part of the movement
- Innovators collaborate on cutting-edge projects
- Experts share valuable insights and resources
- Newcomers grow through mentorship and hands-on experience
- Communities form around shared technological interests
- Learning Resources for blockchain and AI development
- Collaborative Projects to gain real-world experience
- Networking Events connecting developers with industry leaders
- Technical Workshops on cutting-edge technologies