This project implements a streaming server for Google's Generative AI using Elysia and the Bun runtime. It provides an API endpoint for streaming text responses from Google's AI models.
- Streaming responses from Google's Generative AI
- Built with Elysia for fast, type-safe API development
- Uses Bun runtime for improved performance
- Supports the standard message roles (function, system, user, assistant, data, tool), as sketched below
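For reference, the message shape those roles belong to looks roughly like the following. This is a sketch modeled on the Vercel AI SDK message format, not a type defined in this repository:

```ts
// Illustrative message shape; the real types come from the "ai" package.
type MessageRole =
  | "function"
  | "system"
  | "user"
  | "assistant"
  | "data"
  | "tool";

interface ChatMessage {
  role: MessageRole;
  content: string;
}
```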
Before you begin, you will need:

- Bun installed on your system
- A Google AI API key (https://aistudio.google.com/app/apikey)
To set up the project:

- Clone the repository:

  ```bash
  git clone https://github.com/DobroslavR/elysia-vercel-ai-sdk.git
  cd elysia-vercel-ai-sdk
  ```

- Install dependencies:

  ```bash
  bun install
  ```

- Set up environment variables: create a `.env` file in the project root and add your Google AI API key (these values are read by the server, as shown below):

  ```bash
  GOOGLE_AI_API_KEY=your_api_key_here
  PORT=3000
  ```
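Bun loads `.env` files automatically, so no extra configuration library is needed; the server can read these values straight from `process.env`. A minimal sketch of that wiring (illustrative, not necessarily the project's exact code):

```ts
// Illustrative: Bun picks up .env automatically, so these are available at runtime.
const apiKey = process.env.GOOGLE_AI_API_KEY;
const port = Number(process.env.PORT ?? 3000);

// Fail fast if the key is missing rather than letting requests error later.
if (!apiKey) {
  throw new Error("GOOGLE_AI_API_KEY is not set in .env");
}
```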
To start the server in development mode:
```bash
bun dev
```

The server will start, and you should see a message like:

```
🦊 Elysia is running at http://localhost:3000
```
The API exposes a single endpoint:

- `POST /stream`: accepts an array of messages in the request body and returns a streaming response from the Google AI model (a rough sketch of such a route follows below)
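The following is a minimal sketch of how a route with this behavior might be wired up using the project's dependencies. The model ID (`gemini-1.5-flash`), the body schema, and the use of `toTextStreamResponse()` are assumptions for illustration, not the repository's actual source:

```ts
// Hypothetical sketch of the /stream route, not the repository's actual code.
import { Elysia, t } from "elysia";
import { createGoogleGenerativeAI } from "@ai-sdk/google";
import { streamText, type CoreMessage } from "ai";

// Assumes the API key comes from the GOOGLE_AI_API_KEY variable set in .env.
const google = createGoogleGenerativeAI({
  apiKey: process.env.GOOGLE_AI_API_KEY,
});

const app = new Elysia()
  .post(
    "/stream",
    async ({ body }) => {
      // Forward the chat messages to the model and stream the result back.
      const result = await streamText({
        model: google("gemini-1.5-flash"), // model ID is illustrative
        messages: body.messages as CoreMessage[],
      });
      // Elysia can return a standard web Response, so the streamed body
      // is passed straight through to the client.
      return result.toTextStreamResponse();
    },
    {
      body: t.Object({
        messages: t.Array(
          t.Object({ role: t.String(), content: t.String() })
        ),
      }),
    }
  )
  .listen(Number(process.env.PORT ?? 3000));

console.log(`🦊 Elysia is running at http://localhost:${app.server?.port}`);
```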
Example request body:
```json
{
  "messages": [
    {
      "role": "user",
      "content": "Tell me a joke"
    }
  ]
}
```

Key dependencies:

- `@ai-sdk/google`: SDK for interacting with Google's AI models
- `ai`: Utility functions from Vercel for AI operations
- `elysia`: Fast and flexible web framework for Bun
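With these pieces in place, a client can consume the `/stream` endpoint and read the response incrementally. A hypothetical snippet, assuming the server runs on `localhost:3000`:

```ts
// Hypothetical client: post a chat message and print the streamed reply.
const response = await fetch("http://localhost:3000/stream", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    messages: [{ role: "user", content: "Tell me a joke" }],
  }),
});

// Read the body incrementally as chunks arrive from the model.
const reader = response.body!.getReader();
const decoder = new TextDecoder();
for (;;) {
  const { done, value } = await reader.read();
  if (done) break;
  process.stdout.write(decoder.decode(value, { stream: true }));
}
```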
Contributions are welcome! Please feel free to submit a Pull Request.
This project is open-source and available under the MIT License.