AI Agent Functions for Data Processing with Dual-Mode Support
FleetFluid is a Python library that simplifies data transformation by letting you use AI-powered functions without writing them from scratch. Instead of building functions, you invoke ready-made, agent-based functions that handle tasks like text cleaning, information extraction, translation, labeling, anonymization, and more, just by specifying what you need in natural language.
FleetFluid supports two execution modes:
- Open Source Mode: Uses PydanticAI agents directly on your machine
- Cloud Mode: Uses cloud computation for enterprise-grade performance and reliability
The same Python interface works in both modes - just change the initialization to switch between them!

Install with pip:

```bash
pip install fleetfluid
```

```python
from fleetfluid.core import FleetFluid
# Initialize with PydanticAI (open source mode)
ff = FleetFluid(
model="openai:gpt-4",
temperature=0.7,
max_tokens=1000
)
# AI Transformation
result = ff.ai("Rewrite this in a more technical tone", "The data processing pipeline needs optimization.")
print(result)
# Multi-label Classification
result = ff.label("Database query is slow", ["Performance Issue", "Feature Request", "Bug Report"], multiple=True)
print(f"Labels: {result.labels}")
print(f"Confidence: {result.confidence_scores}")from fleetfluid.core import FleetFluid
# Initialize with API key (cloud mode)
ff = FleetFluid(
api_key="your_premium_api_key",
api_endpoint="https://api.fleetfluid.io"
)
# Same interface, different backend!
result = ff.label("Database query is slow", ["Performance Issue", "Feature Request", "Bug Report"])
print(f"Label: {result.label}")# Set environment variables
export FLEETFLUID_API_KEY="your_api_key"
export FLEETFLUID_API_ENDPOINT="https://api.fleetfluid.io"
```

```python
# Automatically detects cloud mode
ff = FleetFluid()  # No parameters needed!
```

You can switch between modes at runtime:

```python
# Start in open source mode
ff = FleetFluid(model="openai:gpt-4")
# Switch to cloud mode
ff.switch_to_cloud_mode("new_api_key", "https://api.fleetfluid.io")
# Switch back to open source mode
ff.switch_to_open_source_mode("anthropic:claude-3-sonnet", temperature=0.3)
```

```python
# Basic initialization
ff = FleetFluid()
# Custom model and parameters
ff = FleetFluid(
model="anthropic:claude-3-sonnet",
temperature=0.7,
max_tokens=1000,
top_p=0.9
)
```

```python
# Explicit configuration
ff = FleetFluid(
api_key="your_api_key",
api_endpoint="https://api.fleetfluid.io"
)
# Environment-based
ff = FleetFluid()  # Uses FLEETFLUID_API_KEY env var
```

All methods work identically in both modes:

`ff.label()` labels text using AI with structured output:

```python
# Single label
result = ff.label("Hello world", ["greeting", "statement", "question"])
print(f"Label: {result.label}")
print(f"Confidence: {result.confidence}")
# Multiple labels
result = ff.label("Hello world", ["greeting", "statement", "question"], multiple=True)
print(f"Labels: {result.labels}")
print(f"Confidence Scores: {result.confidence_scores}")Apply AI transformation to data.
result = ff.ai("Make this more formal", "hey there, what's up?")
print(result)  # "Hello, how are you doing?"
```

`ff.extract()` extracts specific information from text:

```python
skills = ff.extract("skills", "Python developer with ML experience")
print(skills)  # ["Python", "machine learning"]
```

`ff.anonymize()` anonymizes personal information:

```python
anonymized = ff.anonymize("My name is John, email: john@example.com")
print(anonymized)  # "My name is [NAME], email: [EMAIL]"
```

`ff.describe()` generates descriptions from features:

```python
description = ff.describe(
{"color": "blue", "size": "large"},
style="marketing"
)
print(description)  # "A stunning large blue item..."
```

All methods have async counterparts:

```python
# Async versions for use in async contexts
result = await ff.label_async("Hello world", ["greeting", "statement"])
result = await ff.ai_async("Make formal", "hey there")
result = await ff.extract_async("skills", "Python developer")
```

FleetFluid uses the Strategy pattern to seamlessly switch between execution backends:

- Abstract Base Classes: Define interfaces for all operations
- Open Source Implementations: Use PydanticAI agents directly
- Cloud Implementations: Wrap REST API calls
- Dynamic Resolution: Methods delegate to appropriate implementation
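
As a rough illustration of the layering described above, here is a minimal sketch of that Strategy shape. The class names (`LabelBackend`, `OpenSourceLabelBackend`, `CloudLabelBackend`, `DualModeClient`) are hypothetical, chosen for the example; they are not FleetFluid's actual internals.

```python
from abc import ABC, abstractmethod
from typing import List


class LabelBackend(ABC):
    """Abstract interface that both execution modes implement."""

    @abstractmethod
    def label(self, text: str, labels: List[str]) -> str: ...


class OpenSourceLabelBackend(LabelBackend):
    def label(self, text: str, labels: List[str]) -> str:
        # Placeholder: the real backend would run a PydanticAI agent locally.
        return labels[0]


class CloudLabelBackend(LabelBackend):
    def label(self, text: str, labels: List[str]) -> str:
        # Placeholder: the real backend would call the FleetFluid REST API.
        return labels[0]


class DualModeClient:
    """Delegates each call to whichever backend initialization selected."""

    def __init__(self, backend: LabelBackend):
        self._backend = backend

    def label(self, text: str, labels: List[str]) -> str:
        return self._backend.label(text, labels)


# Switching modes amounts to swapping the backend object; the public
# interface stays the same either way.
client = DualModeClient(OpenSourceLabelBackend())
print(client.label("Hello world", ["greeting", "statement"]))
```
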
Configuration is resolved in this order:

- Constructor Parameters (highest priority)
- Environment Variables
- Default Values (lowest priority)

```python
# Constructor overrides environment
ff = FleetFluid(model="openai:gpt-4") # Always open source mode
# Environment used when no constructor params
ff = FleetFluid()  # Uses FLEETFLUID_API_KEY if set
```

| Variable | Description | Default |
|---|---|---|
| `FLEETFLUID_API_KEY` | API key for cloud mode | None |
| `FLEETFLUID_API_ENDPOINT` | API endpoint for cloud mode | `https://api.fleetfluid.io` |

In open source mode, all PydanticAI parameters are supported:

- `model`: Model identifier
- `temperature`: Creativity level (0.0-1.0)
- `max_tokens`: Maximum response length
- `top_p`: Nucleus sampling parameter
- `frequency_penalty`: Frequency penalty
- `presence_penalty`: Presence penalty

Cloud mode accepts:

- `api_key`: Your premium API key
- `api_endpoint`: Custom API endpoint (optional)
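
For orientation, a sketch that puts both parameter sets in one place. Passing `frequency_penalty` and `presence_penalty` as constructor keyword arguments is assumed to mirror the parameters shown in the earlier examples; that detail is an assumption, not something the examples above demonstrate.

```python
from fleetfluid.core import FleetFluid

# Open source mode with the full generation-parameter set listed above.
# (frequency_penalty / presence_penalty as constructor kwargs is assumed.)
ff_local = FleetFluid(
    model="openai:gpt-4",
    temperature=0.7,
    max_tokens=1000,
    top_p=0.9,
    frequency_penalty=0.0,
    presence_penalty=0.0,
)

# Cloud mode with its two parameters.
ff_cloud = FleetFluid(
    api_key="your_premium_api_key",
    api_endpoint="https://api.fleetfluid.io",
)
```
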
Errors in either mode surface as `RuntimeError`:

```python
try:
    result = ff.label("Hello world", ["greeting", "statement"])
except RuntimeError as e:
    if "API" in str(e):
        print("Cloud mode error - check your key and endpoint")
    elif "Open Source" in str(e):
        print("Open source mode error - check your model configuration")
    else:
        print(f"General error: {e}")
```

Security and privacy:

- Open Source Mode: API keys stay on your machine
- Cloud Mode: Uses Bearer token authentication
- HTTPS: All API communications are encrypted
- No Data Logging: Your data is not stored or logged
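
For a sense of what the cloud transport does, the snippet below sends one HTTPS request with `httpx` using Bearer-token authentication. The `/label` path and the JSON payload shape are illustrative assumptions, not FleetFluid's documented API surface; only the Bearer-token-over-HTTPS scheme comes from the points above.

```python
import httpx

API_ENDPOINT = "https://api.fleetfluid.io"
API_KEY = "your_premium_api_key"

# Hypothetical request: endpoint path and payload shape are assumptions
# made for illustration.
response = httpx.post(
    f"{API_ENDPOINT}/label",
    headers={"Authorization": f"Bearer {API_KEY}"},  # Bearer token auth
    json={"text": "Database query is slow", "labels": ["Performance Issue", "Bug Report"]},
    timeout=30.0,
)
response.raise_for_status()
print(response.json())
```
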

Requirements:

- Python 3.8+
- PydanticAI (for open source mode)
- httpx (for cloud mode)
- API key for your chosen model provider (open source mode) or FleetFluid (cloud mode)
See the test.py file for comprehensive testing and examples of the dual-mode functionality.

| Mode | Code | Use Case |
|---|---|---|
| Open Source | `FleetFluid(model="gpt-4")` | Development, open source |
| Open Source | `FleetFluid(model="claude-3", temperature=0.7)` | Custom AI parameters |
| Cloud | `FleetFluid(api_key="key")` | Production, enterprise |
| Cloud | `FleetFluid(api_key="key", api_endpoint="https://api.fleetfluid.io")` | Custom API endpoint |
| Auto | `FleetFluid()` | Uses environment variables |

| Function | Description | Example |
|---|---|---|
| `ff.label(text, labels)` | Single label classification | `ff.label("Hello", ["greeting", "statement"])` |
| `ff.label(text, labels, multiple=True)` | Multiple label classification | `ff.label("Hello", ["greeting", "statement"], multiple=True)` |
| `ff.ai(prompt, data)` | AI transformation | `ff.ai("Make formal", "hey there")` |
| `ff.extract(type, text)` | Information extraction | `ff.extract("skills", "Python developer")` |
| `ff.anonymize(text)` | Text anonymization | `ff.anonymize("My name is John")` |
| `ff.describe(features, style)` | Feature description | `ff.describe({"color": "blue"}, "marketing")` |

```bash
# For cloud mode
export FLEETFLUID_API_KEY="your_key"
export FLEETFLUID_API_ENDPOINT="https://api.fleetfluid.io"
# For open source mode (standard AI provider keys)
export OPENAI_API_KEY="your_openai_key"
export ANTHROPIC_API_KEY="your_anthropic_key"
```

Configuration priority:

- Constructor Parameters (highest)
- Environment Variables
- Default Values (lowest)

Pro Tip: the `api_key` parameter always wins over the `model` parameter!
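
To make that concrete, here is what the rule implies when both kinds of parameters are supplied; the resulting behavior is inferred from the tip above rather than demonstrated.

```python
# Per the rule above, cloud mode is expected to win here and `model` is ignored.
ff = FleetFluid(model="openai:gpt-4", api_key="your_api_key")
```
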
MIT License. See LICENSE file for details.

Support:

- Open Source Users: GitHub Issues
- Premium Users: Contact support@fleetfluid.com