
# 🤺 Fence

The Bloat Moat! A lightweight, production-ready library for LLM communication and agentic workflows. Born from the need for something simpler than LangChain, Fence gives you powerful LLM orchestration without the heavyweight dependencies.

Think of it as the Swiss Army knife for LLM interactions: sharp, reliable, and it won't weigh down your backpack (or your Docker image).


## 🤔 Why Fence?

The short answer: By accident.

The slightly longer answer: LangChain used to be (is?) a pretty big package with a ton of dependencies. Great for PoCs, but in production? Not so much.

The problems we faced:

- 🐘 It's BIG. Takes up serious space (problematic in Lambda, containers, edge environments)
- 🌀 It's COMPLEX. Overwhelming for new users, hard to debug in production
- 💥 It BREAKS. Frequent breaking changes, version jumps that made us cry

As a result, many developers (especially those in large production environments) started building lightweight, custom solutions that favor stability and robustness over feature bloat.

Enter Fence 🤺

We started building basic components from scratch for our Bedrock-heavy production environment. First came the Link class (wink wink), then templates, then agents... and before we knew it, we had a miniature package that was actually fun to use.

Fence strikes the perfect balance between convenience and flexibility.

Note: Fence isn't trying to replace LangChain for complex PoCs. But if you want a simple, lightweight, production-ready package that's easy to understand and extend, you're in the right place.


## 📦 Installation

```bash
pip install fence-llm
```

That's it. Seriously. No 500MB of transitive dependencies.


## 🚀 Quick Start

### Hello World (The Obligatory Example)

```python
from fence.links import Link
from fence.templates.string import StringTemplate
from fence.models.openai import GPT4omini

# Create a link
link = Link(
    model=GPT4omini(),
    template=StringTemplate("Write a haiku about {topic}"),
    name="haiku_generator",
)

# Run it
output = link.run(topic="fencing")["state"]
print(output)
```

Output:

```text
[2024-10-04 17:45:15] [ℹ️ INFO] [links.run:203]              Executing <haiku_generator> Link
Blades flash in the light,
En garde, the dance begins now,
Touch, victory's mine.
```

Much wow. Very poetry. 🎭


## 💪 What Can Fence Do?

Fence is built around a few core concepts that work together beautifully:

### 🤖 Multi-Provider LLM Support

Uniform interface across AWS Bedrock (Claude, Nova), OpenAI (GPT-4o), Anthropic, Google Gemini, Ollama, and Mistral. Switch models with a single line change.

👉 See all supported models →

### 🔗 Links & Chains

Composable building blocks that combine models, templates, and parsers. Chain them together for complex workflows.

👉 Learn about Links & Chains →
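To make the idea concrete, here is a tiny, library-free sketch of the link-and-chain pattern. It is purely illustrative: `make_link` and the toy "models" are invented for this example and are not Fence's API.

```python
# Illustrative sketch of the Link idea (NOT Fence's actual API):
# a link formats a template, calls a model, and parses the output.
# Chaining feeds one link's output into the next link's template.

def make_link(template, model, parser=lambda x: x):
    def run(**kwargs):
        return parser(model(template.format(**kwargs)))
    return run

# Toy "models" that just transform text deterministically.
upper_link = make_link("shout: {text}", lambda prompt: prompt.upper())
count_link = make_link("{text}", lambda prompt: f"{len(prompt)} chars")

# Chain: output of the first link becomes input of the second.
result = count_link(text=upper_link(text="hi"))
print(result)  # 9 chars
```

In Fence, the `Link` class plays the role of `make_link`, with real models and templates slotted in.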

### 🤖 Agentic Workflows ⭐

The crown jewel! Production-ready agents using the ReAct pattern:

- `Agent` - Classic ReAct with tool use and multi-level delegation
- `BedrockAgent` - Native Bedrock tool calling with streaming
- `ChatAgent` - Conversational agents for multi-agent systems

👉 Dive into Agents →
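The ReAct pattern itself is easy to sketch in plain Python: the agent alternates between asking the model for a decision and executing a tool, feeding each observation back into the context until the model produces a final answer. The sketch below is conceptual only; `react_loop`, the decision dict, and the toy model are invented for illustration and do not reflect Fence's implementation.

```python
# Conceptual sketch of a ReAct-style loop (NOT Fence's implementation).

def react_loop(model, tools, question, max_steps=5):
    """Minimal think -> act -> observe loop."""
    context = question
    for _ in range(max_steps):
        decision = model(context)  # model picks a tool or answers
        if decision["action"] == "final_answer":
            return decision["input"]
        observation = tools[decision["action"]](decision["input"])
        context += f"\nObservation: {observation}"
    return None

# Toy "model": uses the calculator once, then answers with the observation.
def toy_model(context):
    if "Observation:" in context:
        return {"action": "final_answer", "input": context.split("Observation: ")[-1]}
    return {"action": "calculator", "input": "1337 * 42 + 999"}

tools = {"calculator": lambda expr: str(eval(expr))}
answer = react_loop(toy_model, tools, "What is 1337 * 42 + 999?")
print(answer)  # 57153
```

Fence's agents wrap this loop with real models, structured tool schemas, and delegation.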

### 🔌 MCP Integration

First-class support for the Model Context Protocol. Connect to MCP servers and automatically expose their tools to your agents.

👉 Explore MCP Integration →

### 🎭 Multi-Agent Systems

Build collaborative agent systems with `RoundTable`, where multiple agents discuss and solve problems together.

👉 Build Multi-Agent Systems →

### 🧠 Memory Systems

Persistent and ephemeral memory backends (DynamoDB, SQLite, in-memory) for stateful conversations.

👉 Configure Memory →
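Conceptually, a memory backend just stores and replays (role, content) pairs; persistent backends swap the Python list for DynamoDB or SQLite. The class below is an illustrative sketch, not Fence's actual interface.

```python
# Illustrative sketch of an ephemeral memory backend (NOT Fence's interface):
# conversation state is just an ordered list of role/content messages.

class InMemoryMemory:
    """Ephemeral conversation memory: lost when the process exits."""

    def __init__(self):
        self._messages = []

    def add_message(self, role, content):
        self._messages.append({"role": role, "content": content})

    def get_messages(self):
        return list(self._messages)  # return a copy so callers can't mutate state

memory = InMemoryMemory()
memory.add_message("user", "What is the capital of France?")
memory.add_message("assistant", "Paris.")
print(len(memory.get_messages()))  # 2
```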

πŸ› οΈ Tools & Utilities

Custom tool creation, built-in tools, retry logic, parallelization, output parsers, logging callbacks, and benchmarking.

πŸ‘‰ Explore Tools & Utilities β†’
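As an example of the retry idea, here is a minimal exponential-backoff helper in plain Python, a common need when LLM providers return transient throttling errors. The `retry` function is a hypothetical sketch, not Fence's utility.

```python
import time

# Hypothetical retry helper (NOT Fence's API): retries a flaky call
# with exponential backoff, re-raising after the final attempt.

def retry(fn, attempts=3, base_delay=0.01):
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, ...

calls = {"count": 0}

def flaky():
    calls["count"] += 1
    if calls["count"] < 3:
        raise RuntimeError("throttled")
    return "ok"

result = retry(flaky)
print(result)  # succeeds on the third attempt
```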




## 🎯 Examples

### Simple Agent with Tools

```python
from fence.agents import Agent
from fence.models.openai import GPT4omini
from fence.tools.math import CalculatorTool

agent = Agent(
    identifier="math_wizard",
    model=GPT4omini(source="demo"),
    tools=[CalculatorTool()],
)

result = agent.run("What is 1337 * 42 + 999?")
print(result)  # Agent thinks, uses calculator, and answers!
```

### BedrockAgent with MCP

```python
from fence.agents.bedrock import BedrockAgent
from fence.mcp.client import MCPClient
from fence.models.bedrock import Claude37Sonnet

# Connect to MCP server
mcp_client = MCPClient(
    transport_type="streamable_http",
    url="https://your-mcp-server.com/mcp",
)

# Create agent with MCP tools
agent = BedrockAgent(
    identifier="mcp_agent",
    model=Claude37Sonnet(region="us-east-1"),
    mcp_clients=[mcp_client],  # Tools auto-registered!
)

result = agent.run("Search for customer data")
```

### Multi-Agent Collaboration

```python
from fence.troupe import RoundTable
from fence.agents import ChatAgent
from fence.models.openai import GPT4omini

# Create specialized agents
detective = ChatAgent(
    identifier="Detective",
    model=GPT4omini(source="roundtable"),
    profile="You are a sharp detective.",
)

scientist = ChatAgent(
    identifier="Scientist",
    model=GPT4omini(source="roundtable"),
    profile="You are a forensic scientist.",
)

# Let them collaborate
round_table = RoundTable(agents=[detective, scientist])
transcript = round_table.run(
    prompt="A painting was stolen. Let's investigate!",
    max_rounds=3,
)
```
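Under the hood, a round-table discussion is essentially a turn-taking loop over a shared transcript. The sketch below is conceptual: `round_table` and the toy agents are invented for illustration and do not reflect `RoundTable`'s actual internals.

```python
# Conceptual sketch of round-robin collaboration (NOT RoundTable's internals):
# each agent sees the running transcript and appends a reply, for a fixed
# number of rounds.

def round_table(agents, prompt, max_rounds=3):
    transcript = [("moderator", prompt)]
    for _ in range(max_rounds):
        for name, reply_fn in agents:
            transcript.append((name, reply_fn(transcript)))
    return transcript

# Toy agents that react deterministically to the latest message.
agents = [
    ("Detective", lambda t: f"Noted: {t[-1][1][:20]}"),
    ("Scientist", lambda t: "Analyzing the evidence."),
]
transcript = round_table(agents, "A painting was stolen.", max_rounds=2)
print(len(transcript))  # 1 prompt + 2 rounds * 2 agents = 5
```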



## 🤝 Contributing

We welcome contributions! Whether it's:

- 🐛 Bug fixes
- ✨ New features (especially new model providers!)
- 📝 Documentation improvements
- 🧪 More tests
- 🎨 Better examples

Check out CONTRIBUTING.md for guidelines.


## 📄 License

MIT License - see LICENSE.txt for details.


πŸ™ Acknowledgments

Inspired by LangChain, built for production, made with ❀️ by developers who got tired of dependency hell.

Now go build something awesome! πŸš€
