Recursive Language Model patterns for Claude Code — handle massive contexts (10M+ tokens) by treating them as external variables
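The repositories under this topic share one core idea: the long context never enters a single prompt; it is held in an external variable and explored through recursive sub-calls. Below is a minimal sketch of that pattern, assuming a placeholder `call_llm` function and an arbitrary chunk size rather than any specific repository's API.

```python
# Minimal sketch of the Recursive Language Model (RLM) pattern:
# the full context lives in an ordinary Python variable, and the
# model only ever sees small slices of it via recursive sub-calls.
# `call_llm` is a placeholder for whatever LLM client you use.

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM API call (e.g. an Anthropic client)."""
    raise NotImplementedError

CHUNK_CHARS = 50_000  # small enough to fit comfortably in one sub-call

def rlm_answer(question: str, context: str) -> str:
    # Base case: the slice is small enough to answer over directly.
    if len(context) <= CHUNK_CHARS:
        return call_llm(f"Context:\n{context}\n\nQuestion: {question}")

    # Recursive case: split the oversized context, answer over each
    # half independently, then synthesize the partial answers.
    mid = len(context) // 2
    partials = [
        rlm_answer(question, context[:mid]),
        rlm_answer(question, context[mid:]),
    ]
    return call_llm(
        "Combine these partial answers into one final answer.\n"
        f"Question: {question}\n"
        + "\n".join(f"- {p}" for p in partials)
    )
```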
Implementation of the Recursive Language Model paper from scratch
MCP server to optimize the Claude Code context window and efficiently scan large files and codebases
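As a rough illustration of how such an MCP server can expose bounded file access to Claude Code, here is a sketch using the official MCP Python SDK's FastMCP helper. The tool name and slice parameters are hypothetical, not this repository's actual interface.

```python
# Illustrative sketch of an MCP server that lets Claude Code read a
# large file in bounded slices instead of loading it whole.
# The tool name and parameters below are assumptions for illustration.
from pathlib import Path

from mcp.server.fastmcp import FastMCP

mcp = FastMCP("large-file-reader")

@mcp.tool()
def read_slice(path: str, offset: int = 0, length: int = 20_000) -> str:
    """Return a bounded slice of a file so the model's context stays small."""
    text = Path(path).read_text(encoding="utf-8", errors="replace")
    return text[offset : offset + length]

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```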
DSPy's Recursive Language Model (RLM) with Modal Sandbox for secure cloud-based code execution
Benchmark harness for A/B testing Claude Code plugins against OOLONG long-context reasoning tasks. Compare truncation vs RLM-RS recursive chunking strategies. Features Claude Code hooks integration, SQLite persistence, and comprehensive scoring aligned with the OOLONG paper methodology.
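A rough sketch of the A/B shape such a harness might take: run each long-context task once per strategy and persist scores to SQLite for comparison. The strategy stubs and exact-match scoring below are stand-ins, not the plugin's real hooks or the OOLONG metric.

```python
# Rough sketch of an A/B benchmark loop: score each task under two
# strategies (truncation vs. recursive chunking) and persist results
# to SQLite. Strategy and scoring functions are placeholder stubs.
import sqlite3

def answer_with_truncation(question: str, context: str) -> str:
    raise NotImplementedError  # e.g. prompt over the first N characters only

def answer_with_recursive_chunking(question: str, context: str) -> str:
    raise NotImplementedError  # e.g. the recursive rlm_answer sketch above

def score(prediction: str, gold: str) -> float:
    return float(prediction.strip() == gold.strip())  # placeholder metric

def run_ab(tasks, db_path="results.db"):
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS results (task_id TEXT, strategy TEXT, score REAL)"
    )
    strategies = {
        "truncation": answer_with_truncation,
        "recursive_chunking": answer_with_recursive_chunking,
    }
    for task in tasks:  # each task: dict with id, question, context, gold
        for name, fn in strategies.items():
            pred = fn(task["question"], task["context"])
            con.execute(
                "INSERT INTO results VALUES (?, ?, ?)",
                (task["id"], name, score(pred, task["gold"])),
            )
    con.commit()
    con.close()
```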
Hexagonal architecture implementation of Recursive Language Models (RLM)
📄 Enhance document processing by implementing Recursive Language Models with Claude Code to exceed typical context limits and manage larger inputs effectively.
MCP server implementing Recursive Language Model pattern for processing arbitrarily long contexts. Enables Claude Code to work with 1M+ character documents through session-based chunking, BM25 search, and artifact provenance tracking.
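A small sketch of the chunk-then-search idea behind that server, using the rank_bm25 package as one possible BM25 implementation; the chunk size and whitespace tokenization are illustrative choices, not the server's actual parameters.

```python
# Sketch of session-based chunking plus BM25 retrieval: split a long
# document into fixed-size chunks once, then search them per query so
# only the top-ranked chunks are handed to the model.
from rank_bm25 import BM25Okapi

CHUNK_CHARS = 4_000  # illustrative chunk size

def chunk_document(text: str) -> list[str]:
    return [text[i : i + CHUNK_CHARS] for i in range(0, len(text), CHUNK_CHARS)]

def build_index(chunks: list[str]) -> BM25Okapi:
    # Naive whitespace tokenization; a real server would likely do better.
    return BM25Okapi([c.lower().split() for c in chunks])

def top_chunks(index: BM25Okapi, chunks: list[str], query: str, n: int = 5) -> list[str]:
    return index.get_top_n(query.lower().split(), chunks, n=n)

# Usage: index once per session, then answer many queries cheaply.
# chunks = chunk_document(open("big_doc.txt").read())
# index = build_index(chunks)
# relevant = top_chunks(index, chunks, "where is the refund policy defined?")
```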