Note: This repository is a fork of superglue that integrates OpenRouter support. Full credit goes to the original creators at superglue-ai for their amazing work.
superglue is a self-healing, open-source data connector. You can deploy it as a proxy between you and any complex or legacy API and always get the data you want in the format you expect.
Here's how it works: you define your desired data schema and provide basic instructions about an API endpoint (like "get all issues from Jira"). superglue then does the following:
- Automatically generates the API configuration by analyzing API docs.
- Handles pagination, authentication, and error retries.
- Transforms response data into the exact schema you want using JSONata expressions (a sketch of such a mapping follows this list).
- Validates that all data coming through follows that schema, and fixes transformations when they break.
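To give a sense of what those generated mappings look like, here is a hand-written JSONata expression of the kind superglue produces and repairs, evaluated with the open-source `jsonata` package. The field names mirror the Futurama example further down and are illustrative only; in practice superglue writes and maintains this expression for you.

```typescript
import jsonata from "jsonata";

// Illustrative JSONata mapping: pick two fields, lowercase one, drop the rest.
// superglue generates expressions like this from your instruction + schema.
const expression = jsonata(`{
  "characters": [characters.{
    "name": name,
    "species": $lowercase(species)
  }]
}`);

const rawApiResponse = {
  characters: [
    { name: "Phillip J. Fry", species: "HUMAN", homePlanet: "Earth" }
  ]
};

// => { characters: [ { name: "Phillip J. Fry", species: "human" } ] }
console.log(await expression.evaluate(rawApiResponse));
```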
If you're spending a lot of time writing code connecting to weird APIs, fumbling with custom fields in foreign language ERPs, mapping JSONs, extracting data from compressed CSVs sitting on FTP servers, and making sure your integrations don't break when something unexpected comes through, superglue might be for you.
```mermaid
flowchart LR
  subgraph Input[data sources]
    A1[APIs]
    A2[files]
    A3[legacy systems]
  end
  subgraph Process[data transformation]
    T1[superglue engine]
  end
  subgraph Output[destination]
    D1[your system]
  end

  Input --> Process
  Process --> Output

  %% Styling
  classDef sourceStyle fill:#f9f,stroke:#333,stroke-width:2px
  classDef processStyle fill:#bbf,stroke:#333,stroke-width:2px
  classDef outputStyle fill:#bfb,stroke:#333,stroke-width:2px
  class Input sourceStyle
  class Process processStyle
  class Output outputStyle
```
- Sign up for early access to the hosted version of superglue at superglue.cloud
- Install the superglue js/ts client:

```bash
npm install @superglue/client
```

- Configure your first API call:

```typescript
import { SuperglueClient } from "@superglue/client";

const superglue = new SuperglueClient({
  apiKey: "************"
});

const config = {
  urlHost: "https://futuramaapi.com",
  urlPath: "/graphql",
  instruction: "get all characters from the show",
  responseSchema: {
    type: "object",
    properties: {
      characters: {
        type: "array",
        items: {
          type: "object",
          properties: {
            name: { type: "string" },
            species: { type: "string", description: "lowercased" }
          }
        }
      }
    }
  }
};
const result = await superglue.call({endpoint: config});
console.log(JSON.stringify(result.data, null, 2));
/*
output:
{
  "characters": [
    {
      "name": "Phillip J. Fry",
      "species": "human"
    },
    ...
  ]
}
*/
```

Run your own instance of superglue using Docker:
- Pull the Docker image:

```bash
docker pull superglueai/superglue
```

- Create a `.env` file by copying the `.env.example` file at the root.
- Start the server:

```bash
docker run -d \
  --name superglue \
  --env-file .env \
  -p 3000:3000 \
  -p 3001:3001 \
  superglueai/superglue
```

- Verify the installation:

```bash
curl http://localhost:3000/health
> OK
# or open http://localhost:3000/?token=your-auth-token
```

- Open the dashboard to create your first configuration: http://localhost:3001/
- Run your first call:

```bash
npm install @superglue/client
```

```typescript
import { SuperglueClient } from "@superglue/client";

const superglue = new SuperglueClient({
  endpoint: "http://localhost:3000",
  apiKey: "your-auth-token"
});

// either via config object
const config = {
  urlHost: "https://futuramaapi.com",
  urlPath: "/graphql",
  instruction: "get all characters from the show",
};
const result = await superglue.call({ endpoint: config });

// or via the api id if you have already created the endpoint
const result2 = await superglue.call({ id: "futurama-api" });

console.log(JSON.stringify(result.data, null, 2));
```

- LLM-Powered Data Mapping: Automatically generate data transformations using large language models
- API Proxy: Intercept and transform API responses in real-time with minimal added latency
- File Processing: Handle various file formats (CSV, JSON, XML) with automatic decompression
- Schema Validation: Ensure data compliance with your specified schemas (see the sketch after this list)
- Flexible Authentication: Support for various auth methods including header auth, api keys, oauth, and more
- Smart Pagination: Handle different pagination styles automatically
- Caching & Retry Logic: Built-in caching and configurable retry strategies
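As a concrete note on the schema-validation point above: the responseSchema you pass is plain JSON Schema, so you can add constraints such as required fields, and superglue checks every response against them, fixing the transformation when validation starts failing. A sketch reusing the documented config shape from the quick start (the endpoint and fields are the same illustrative Futurama example):

```typescript
const strictConfig = {
  urlHost: "https://futuramaapi.com",
  urlPath: "/graphql",
  instruction: "get all characters from the show",
  responseSchema: {
    type: "object",
    required: ["characters"],
    properties: {
      characters: {
        type: "array",
        items: {
          type: "object",
          // every character must come through with both fields present
          required: ["name", "species"],
          properties: {
            name: { type: "string" },
            species: { type: "string", description: "lowercased" }
          }
        }
      }
    }
  }
};

const result = await superglue.call({ endpoint: strictConfig });
```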
For detailed documentation, visit docs.superglue.cloud.
We love contributions! Feel free to open issues for bugs or feature requests.
superglue is GPL licensed. The superglue client SDKs are MIT licensed. See LICENSE for details.
- Discord: Join our community
- Issues: GitHub Issues
Superglue now supports using OpenRouter as an alternative to OpenAI for all LLM operations. This allows you to access hundreds of AI models through a single unified API.
To use OpenRouter, add the following environment variables to your .env file:
```bash
# Set to true to use OpenRouter instead of OpenAI
USE_OPENROUTER=true

# Your OpenRouter API key (get one at https://openrouter.ai/keys)
OPENROUTER_API_KEY=sk-or-your-key-here

# OpenRouter base URL (usually you don't need to change this)
OPENROUTER_API_BASE_URL=https://openrouter.ai/api/v1

# OpenRouter model to use (e.g., 'anthropic/claude-3.5-sonnet', 'openai/gpt-4o', etc.)
OPENROUTER_MODEL=openai/gpt-4o

# OpenRouter model for schema generation (can be the same as OPENROUTER_MODEL or a faster model)
OPENROUTER_SCHEMA_MODEL=openai/gpt-4o
```
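Because OpenRouter exposes an OpenAI-compatible API, you can sanity-check your key and model choice independently of superglue by pointing the standard `openai` npm package at the OpenRouter base URL. This is just a verification sketch, not part of superglue itself:

```typescript
import OpenAI from "openai";

// Point the stock OpenAI SDK at OpenRouter using the same variables as above.
const client = new OpenAI({
  baseURL: process.env.OPENROUTER_API_BASE_URL ?? "https://openrouter.ai/api/v1",
  apiKey: process.env.OPENROUTER_API_KEY,
});

const completion = await client.chat.completions.create({
  model: process.env.OPENROUTER_MODEL ?? "openai/gpt-4o",
  messages: [{ role: "user", content: "Reply with the single word: ok" }],
});

console.log(completion.choices[0].message.content);
```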
OpenRouter provides access to a wide range of models from various providers including OpenAI, Anthropic, Google, Mistral, and more. You can view the full list of available models on the OpenRouter Models page.
Popular models include:
- `openai/gpt-4o` - OpenAI's GPT-4o
- `anthropic/claude-3.5-sonnet` - Anthropic's Claude 3.5 Sonnet
- `google/gemini-1.5-pro` - Google's Gemini 1.5 Pro
- `mistral/mixtral-8x7b` - Mistral's Mixtral 8x7B
- Access to multiple models - Use the best model for your specific use case
- Fallback routing - Automatically retry with alternative models if your primary choice fails
- Cost management - Compare pricing across providers
- Unified API - No need to change your code to switch between models
You can easily switch between OpenAI and OpenRouter by changing the `USE_OPENROUTER` environment variable:
- Set `USE_OPENROUTER=true` to use OpenRouter
- Set `USE_OPENROUTER=false` to use OpenAI directly
This allows you to switch providers without any code changes.
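For example, if you run the Docker image as shown above, you can flip providers for a single container by overriding the flag at launch (`-e` values take precedence over `--env-file`; the container name here is just an example):

```bash
# Same image and ports as above, but forcing OpenAI instead of OpenRouter
docker run -d \
  --name superglue-openai \
  --env-file .env \
  -e USE_OPENROUTER=false \
  -p 3000:3000 \
  -p 3001:3001 \
  superglueai/superglue
```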
