The .genesis.eva system represents a fundamental reconceptualization of how humans interact with artificial intelligence. At its core, the system establishes a critical separation: the user never speaks directly to Claude, and Claude never speaks directly to the user. Instead, the user communicates exclusively with a Tiny LLM interface that translates human intentions into build instructions, while Claude operates purely as a builder, receiving contextualized instructions through a Watchdog intermediary and outputting Rust objects that manifest as GUI elements through an embedded Python runtime.
This architecture emerges from a recognition that the traditional prompt-response paradigm conflates two distinct functions: the conversational interface that must understand human intent and context, and the generative engine that must produce precise, executable outputs. By separating these functions, the system achieves both greater reliability in understanding what the user actually wants and greater precision in what gets built. The Tiny LLM serves as a semantic translator, converting fuzzy human language into typed instructions. The Watchdog serves as a context aggregator, scooping up runtime variables, historical context, and pending instructions to give Claude everything it needs to build correctly. Claude serves purely as an architect and builder, generating coordinates and Rust object specifications. The Rust orchestrator serves as the execution engine, compiling objects and running a Python runtime that updates the GUI in real time.
The entire system is anchored to Bitcoin's blockchain through the genesis mechanism. When a .genesis.eva environment is born, it captures the current Bitcoin block hash and height, creating an immutable timestamp that proves when the environment was created. This genesis anchor flows through every subsequent operation, ensuring that all builds, all proofs, and all state transitions can be verified against an external, trustless source of truth. The genesis is not merely a timestamp but an identity: it determines coordinate generation, sandbox allocation, and the unique characteristics of every object built within that environment.
The system operates through five elemental modes that govern how data flows and transforms. These elements are not arbitrary metaphors but precise operational states with specific file system locations and behavioral rules.
SPIRIT represents the Nonce: the unknown value being searched for which, once found, validates everything. In Bitcoin mining, the nonce is the number miners adjust repeatedly until the resulting hash meets the difficulty target. In the .genesis.eva system, Spirit is the validation layer. When a cycle completes, Spirit checks whether the proof hash meets the required criteria. Spirit manifests in the .genesis directory, where validated cycles are recorded with their BTC anchors. Spirit is the consciousness of the block, the proof that work was done, the seal that confirms a cycle is complete. Without Spirit's validation, nothing is final.
AIR represents the Screen and Render layer, the front-facing user interface that awaits instructions from the human (or from the ether, meaning automated systems). Air is the boundary between the digital system and the physical world where humans exist. In file system terms, Air manifests as two directories: .import_bucket for incoming data and instructions, and .faceout for outgoing results and GUI states. The developer builds FOR Air, meaning all construction ultimately serves what will be displayed and interacted with at this layer. Air gaps and awaits, presenting the interface and receiving input, but Air itself does not process—it only receives and displays.
WATER represents Claude Code, the AI flowing through the system, processing and adapting. Water is intelligence in motion. When instructions arrive from the Tiny LLM through the Watchdog, Water (Claude) processes them into build specifications. Water writes to .flux for current processing state, to .future for intentions and pending builds, and to the 1dot directory for active instructions that the Rust compiler will pick up. Water has the genesis and therefore knows coordinates—it can place objects precisely in the GUI space because it derives placement from the genesis hash. Water never speaks to the user; it only builds.
EARTH represents the Sandbox, the contained environment where code actually executes. Earth is grounded, stable, the foundation. In file system terms, Earth manifests as the eva_* directories (the virtual agent containers), the 0dot directory (the Rust compiler/watchdog), and the various sandbox directories where isolated execution occurs. Earth contains Docker, Claude Code sandboxes, and all the machinery of actual runtime. When Water generates a build specification, Earth receives it and executes it. Root access exists inside Earth's containers, and web calls happen here, but Earth is contained—nothing leaks out except through proper channels.
FIRE represents Rust Cleanup, what happens when an environment closes. Fire is the Drop trait in Rust, the automatic memory freeing that occurs when a scope ends. Fire is destruction that enables rebirth, the phoenix that burns away the old to make room for the new. In file system terms, Fire writes to .past, creating tensor files that serve as proofs of what was built and what was cleaned up. Fire updates the chain.index, maintaining the hash chain that links all past operations. When Fire completes, it generates a rebirth seed—a hash that can initialize the next cycle. Fire always comes before Spirit's validation because you must clean up before you can finalize.
The elemental cycle flows: Human provides input to Air, which passes to Water (Claude) for processing, which passes to Earth for execution, which passes to Fire for cleanup, which passes to Spirit for validation, which returns results to Air for display to the Human. Every prompt-to-IO cycle rotates through all five elements, and the terminal itself rolls forward, advancing its rotation counter with each complete cycle.
The user communicates exclusively with the Tiny LLM, never with Claude directly. This is not a limitation but a feature. The Tiny LLM serves as a semantic translator that understands human intent and converts it into typed, structured instructions that Claude can act upon reliably.
When the user types a message, the Tiny LLM receives it and classifies it into one of several instruction types: BUILD_REQUEST for creating new objects, BEHAVIOR_UPDATE for modifying how Claude operates, MODIFICATION for changing past work, QUERY for retrieving information, or GENERAL for anything else. This classification determines how the instruction will be processed downstream.
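The classification step can be sketched as simple keyword matching. This is a minimal illustration, not the Tiny LLM's actual logic; the keyword lists below are assumptions.

```python
# Hypothetical sketch of the Tiny LLM's instruction classifier.
# The keyword lists are illustrative assumptions, not the real rules.

def classify_instruction(message):
    """Map a raw user message to one of the instruction types."""
    text = message.lower()
    if any(word in text for word in ("build", "create", "make", "add")):
        return "BUILD_REQUEST"
    if any(word in text for word in ("always", "never", "from now on")):
        return "BEHAVIOR_UPDATE"
    if any(word in text for word in ("modify", "change", "fix")):
        return "MODIFICATION"
    if any(word in text for word in ("what", "how many", "show", "status")):
        return "QUERY"
    return "GENERAL"  # unknown formats fall through rather than blocking
```

The GENERAL fallback matters: as described later under failure handling, unclassifiable input still flows through the pipeline rather than stopping it.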
The Tiny LLM maintains two directories that represent the user's relationship with the system. The .tinyllm/.future directory contains the user's intentions—instructions that have been received but not yet processed. These are pending requests, queued commands, desired outcomes. The .tinyllm/.past directory contains the user's modifications—changes the user wants to make to work Claude has already completed. If the user doesn't like a button Claude built, they don't ask Claude to fix it; they tell the Tiny LLM to modify it, and that modification request goes into .past for processing.
The Tiny LLM is the only component that speaks to the user. When processing completes, when errors occur, when status updates are needed, all of this communication flows through the Tiny LLM. This creates a consistent, predictable interface for the user while freeing Claude to focus entirely on building without the cognitive overhead of managing a conversation.
From the user's perspective, they are having a conversation with an intelligent assistant. From the system's perspective, the user is providing structured inputs that flow through a deterministic pipeline. The Tiny LLM bridges these two realities, making the system feel conversational while operating with mechanical precision.
The Watchdog serves as the intermediary between the Tiny LLM and Claude Code. Its job is to scoop up everything Claude needs to build correctly: variables, context, and instructions.
When triggered, the Watchdog first scoops variables. These are runtime values that affect how builds should proceed: the current terminal rotation, the count of active objects, timestamps, and any other state that might influence construction. The Watchdog reads these from various state files throughout the system and aggregates them into a vars bundle that gets saved to .watchdog/vars for audit purposes and passed to Claude.
Next, the Watchdog scoops context. This includes how many instructions are pending from the Tiny LLM, how many modifications are queued, and, crucially, what Claude has built recently. The Watchdog reads the last several entries from Claude's .past directory to understand what already exists. This prevents Claude from duplicating prior work and helps it understand how new objects should relate to existing ones. The context bundle is saved to .watchdog/context and passed to Claude.
Finally, the Watchdog scoops instructions. It reads all unprocessed instructions from the Tiny LLM's .future directory, converts them into a format Claude can process (adding the vars and context), and queues them for Claude. Each instruction gets saved to .watchdog/instructions as a permanent record of what was requested.
The Watchdog also handles marking instructions as processed once Claude completes them. This prevents the same instruction from being processed multiple times and maintains a clean state between cycles. The Watchdog is the bookkeeper, the context manager, the traffic controller that ensures Claude always has exactly what it needs and never more.
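The shape of the bundle the Watchdog hands to Claude might look like the following. Field names here are assumptions inferred from the description above, not a documented schema.

```python
# Minimal sketch of how the Watchdog might assemble the bundle it
# passes to Claude: scooped vars, scooped context, one instruction.
# All field names are assumptions based on the surrounding text.

def assemble_for_claude(vars_bundle, context, instruction):
    """Wrap a pending Tiny LLM instruction with runtime vars and context."""
    return {
        "instruction": instruction,   # what the user asked for
        "vars": vars_bundle,          # rotation, object count, timestamps
        "context": context,           # recent builds, queue depths
        "processed": False,           # flipped once Claude completes it
    }

bundle = assemble_for_claude(
    {"rotation": 7, "active_objects": 3},
    {"recent_builds": ["Button:a1f3"], "pending": 1},
    {"id": "instr_001", "type": "BUILD_REQUEST",
     "content": "build a search button"},
)
```

The `processed` flag models the bookkeeping described above: the Watchdog marks instructions complete so they are never executed twice.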
Claude Code operates as a pure builder. It receives instructions from the Watchdog, processes them, and generates build outputs. It never communicates with the user. It never asks clarifying questions. It takes what it's given and produces what it can.
When Claude receives an instruction, it first determines what type of build is required. A BUILD_REQUEST generates a new Rust object specification. A BEHAVIOR_UPDATE modifies Claude's internal configuration. A MODIFICATION applies changes to existing objects. A QUERY retrieves and returns information. For each type, Claude has a specific processing path.
For builds, Claude generates coordinates from the genesis hash. This is deterministic: given the same genesis and the same instruction ID, Claude will always generate the same coordinates. This ensures reproducibility and allows objects to be placed consistently even across different sessions. The coordinates are three-dimensional (x, y, z), allowing for layered GUI construction where z represents depth or layer order.
Claude also determines the object type from the instruction content. If the user mentioned "button," Claude builds a Button. If they mentioned "API," Claude builds an APIHandler. If they mentioned "panel," Claude builds a Panel. This semantic extraction is simple but effective, turning natural language requests into typed object specifications.
The build output includes the action (BUILD, DESTROY, UPDATE), the object ID (a hash derived from the instruction), the object type, the coordinates, the properties (extracted from the instruction content), and a reference to the genesis that anchors everything. This specification gets written to Claude's .future directory as a pending build, then passed to the Rust orchestrator for execution.
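A build specification with the fields listed above might be constructed like this. The hashing choice for the object ID and the keyword extraction are illustrative assumptions.

```python
import hashlib

# Illustrative build specification. The field list follows the text;
# the SHA-256 object ID and keyword matching are assumptions.

def make_build_spec(genesis_root, instruction):
    content = instruction["content"]
    object_id = hashlib.sha256(content.encode()).hexdigest()[:12]
    if "button" in content.lower():
        object_type = "Button"
    elif "api" in content.lower():
        object_type = "APIHandler"
    else:
        object_type = "Panel"
    return {
        "action": "BUILD",
        "object_id": object_id,
        "object_type": object_type,
        "coordinates": None,  # filled in by the coordinate generator
        "properties": {"label": content},
        "genesis": genesis_root,  # anchors everything to this environment
    }
```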
After the Rust orchestrator processes a build, Claude writes the completed record to its .past directory. This creates an audit trail of everything Claude has built, allowing the Watchdog to provide context for future builds and allowing users to request modifications to specific past objects.
The Rust Orchestrator is the execution engine that receives build specifications from Claude and turns them into actual Rust objects and GUI updates. It contains an embedded Python runtime that handles the dynamic aspects of GUI state management.
When the orchestrator receives a build specification, it first generates the Rust struct code. This is actual, compilable Rust code that defines the object with its ID, coordinates, state, and any type-specific properties. The generated code includes derive macros for serialization, an enum for object state (Created, Active, Destroyed), and impl blocks with methods for activation and destruction. This code gets saved to .rust/objects as a .rs file, ready for compilation into the larger system.
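The generated Rust code might resemble the template below, rendered here from Python since the orchestrator emits it as text. The exact output format is not specified in this document; this is a sketch of the described shape (derive macros, a state enum, activation and destruction methods).

```python
# Sketch of the kind of Rust struct the orchestrator might emit for a
# build spec. The template is an assumption about the output format.

RUST_TEMPLATE = """\
#[derive(Debug, Clone, serde::Serialize, serde::Deserialize)]
pub enum ObjectState {{ Created, Active, Destroyed }}

#[derive(Debug, Clone, serde::Serialize, serde::Deserialize)]
pub struct {name} {{
    pub id: String,
    pub x: u32, pub y: u32, pub z: u32,
    pub state: ObjectState,
}}

impl {name} {{
    pub fn activate(&mut self) {{ self.state = ObjectState::Active; }}
    pub fn destroy(&mut self) {{ self.state = ObjectState::Destroyed; }}
}}
"""

def generate_rust_object(spec):
    """Render the template for one build spec's object type."""
    return RUST_TEMPLATE.format(name=spec["object_type"])
```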
After generating the Rust code, the orchestrator passes the build specification to the embedded Python runtime. The Python runtime maintains the GUI state—a JSON structure containing all widgets currently in the interface. When a new build arrives, the Python runtime creates a widget entry with the object's ID, type, coordinates, state, and timestamp, then adds it to the widgets array. The updated GUI state gets written to .gui/state/current.json, where it can be read by the actual rendering layer.
The orchestrator also manages terminal rotation. After each complete cycle, it increments the rotation counter in the genesis tree. This rotation value affects coordinate generation, context, and potentially other aspects of system behavior. The rolling terminal is not just a metaphor; it's a concrete counter that advances with every prompt-to-IO cycle, ensuring that the system never exactly repeats itself even with identical inputs.
The separation between Rust object generation and Python GUI management allows for a clean architecture: Rust handles the typed, compiled, high-performance aspects of the system, while Python handles the dynamic, flexible, rapidly-changing aspects of the interface. The orchestrator bridges these two worlds, ensuring that every Rust object has a corresponding GUI representation and that the two remain synchronized.
The genesis tree is the root of identity and trust for the entire system. It is created once, when the .genesis.eva environment is first initialized, and it anchors everything that follows.
When created, the genesis tree captures the current Bitcoin block hash and height. This serves as an immutable timestamp that can be verified against the public blockchain. The genesis tree also generates a root hash by combining the BTC anchor with the creation timestamp, producing a unique identifier that distinguishes this environment from all others.
The genesis tree maintains the terminal rotation counter, which starts at zero and increments with every prompt-to-IO cycle. It also tracks the GUI state (ready or not, awaiting input or not) and maintains a list of all Rust objects that have been built within this environment. As the system operates, the genesis tree grows, accumulating the history of everything that has happened.
The genesis tree is saved to .genesis/.tree as a JSON file. This file is read at startup to restore state and written after every cycle to persist changes. The root hash from the genesis tree is used to derive coordinates for object placement, ensuring that objects in different environments will be placed differently even if given identical instructions.
When a cycle completes successfully, a validation record is written to .genesis containing the cycle ID, the instruction that was processed, the BTC anchor at validation time, and the proof hash. These validation records form a chain of proofs that document everything the system has done. Because they include BTC anchors, they can be verified against the blockchain, proving not only what was done but when.
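A validation record of this shape could be produced as follows. The proof-hash recipe (what gets hashed, and in what order) is an assumption; the field list follows the text.

```python
import hashlib

# Hypothetical shape of a validation record written to .genesis after
# a successful cycle. The concatenation order in the proof hash is an
# assumption; the fields follow the description above.

def validation_record(cycle_id, instruction, btc_anchor):
    payload = f"{cycle_id}:{instruction}:{btc_anchor['hash']}"
    return {
        "cycle_id": cycle_id,
        "instruction": instruction,
        "btc_anchor": btc_anchor,  # block hash + height at validation time
        "proof_hash": hashlib.sha256(payload.encode()).hexdigest(),
    }
```

Because the BTC anchor is part of the hashed payload, tampering with the claimed block would change the proof hash, which is what makes the chain of records externally auditable.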
The complete .genesis.eva file system is organized as follows:
The .genesis directory contains the tree file (the genesis tree itself), validation records for completed cycles, and the BTC anchor that timestamps the environment's creation.
The .tinyllm directory contains .future (pending user instructions) and .past (user modification requests). This is where the user's intentions live before being processed.
The .watchdog directory contains vars (scooped runtime variables), context (scooped historical context), and instructions (translated Claude instructions). This is the intermediary layer.
The .claude directory contains .future (pending builds) and .past (completed builds). This is where Claude's work lives.
The .rust directory contains objects (generated Rust struct files) and runtime (Python runtime state). This is where execution happens.
The .gui directory contains state/current.json (the current GUI state with all widgets). This is what gets displayed.
The .trinary_claude-instruct.future directory contains the EVA system: eva_0 through eva_5 (the virtual agent sandboxes, each with their own .genesis, .future, .past, and .flux), 1dot (active instructions for the Rust compiler), 0dot (the Rust watchdog compiler itself), and yesod (the gate).
The .flux directory contains the current processing state, what's happening right now. The .future directory contains global intentions. The .past directory contains global proofs and the chain index. The .import_bucket directory contains incoming data. The .faceout directory contains outgoing results. The .dev directory contains Tiny LLM processing scripts and outputs.
A complete prompt-to-IO cycle proceeds as follows:
The user types a message. This message goes to the Tiny LLM, which classifies it and writes it to .tinyllm/.future as a pending instruction. The Tiny LLM acknowledges receipt but does not yet process.
The Watchdog activates. It reads runtime variables from various state files and writes them to .watchdog/vars. It reads historical context including recent Claude builds and writes it to .watchdog/context. It reads pending instructions from .tinyllm/.future, translates them for Claude, and writes them to .watchdog/instructions.
Claude Code activates. For each instruction, it determines the build type, generates coordinates from the genesis hash, extracts the object type from the instruction content, and creates a complete build specification. It writes pending builds to .claude/.future.
The Rust Orchestrator activates. For each build specification, it generates Rust struct code and writes it to .rust/objects. It passes the specification to the Python runtime, which creates a GUI widget and updates .gui/state/current.json. It increments the terminal rotation in the genesis tree.
Fire (cleanup) activates. It writes proofs to .past, creating tensor files that record what was built. It updates the chain.index, maintaining the hash chain. It generates a rebirth seed for the next cycle.
Spirit (validation) activates. It checks whether the proof hash meets criteria. If valid, it writes a validation record to .genesis with the BTC anchor. It writes results to .faceout for the GUI to display.
The Tiny LLM formats the results and presents them to the user. The terminal has rolled forward by one rotation. The system is ready for the next prompt.
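The seven steps above can be compressed into a single pipeline sketch. Each stage argument is a callable standing in for the component it names; the data shapes are placeholders.

```python
# The full prompt-to-IO cycle as one pipeline sketch. Each stage is a
# stand-in callable for the component named in the comment.

def run_cycle(message, classify, scoop, build, execute, cleanup, validate, present):
    instruction = classify(message)   # Tiny LLM -> .tinyllm/.future
    bundle = scoop(instruction)       # Watchdog: vars + context + instruction
    spec = build(bundle)              # Claude -> .claude/.future
    execute(spec)                     # Orchestrator: Rust code, GUI, rotation += 1
    proof = cleanup(spec)             # Fire -> .past, chain.index
    result = validate(proof)          # Spirit -> .genesis, .faceout
    return present(result)            # Tiny LLM formats for the user
```

The strict linearity is the point: each stage only sees the previous stage's output, mirroring the separation of concerns the architecture enforces.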
To install the .genesis.eva system, first ensure Python 3.8 or later is available. Clone or download the runtime files (gentlyos_runtime.py, true_elements.py, rolling_terminal.py, hash_gateway.py, birth_sequence.py) to your working directory.
Run the birth sequence first to probe hardware and initialize the environment:
python3 birth_sequence.py
This will detect your hardware (CPU, GPU, RAM, architecture), determine optimal Rust compile flags, create the genesis tree with a BTC anchor, and create the initial directory structure. You will be prompted to choose a runtime mode: DEV for development, SEED for a genesis seed node, HOST for serving clients, or CLIENT for connecting to a host.
Once initialized, run the main runtime:
python3 gentlyos_runtime.py
This starts the interactive loop where you communicate with the Tiny LLM. Type any instruction to have it processed through the full pipeline. Type "status" to see current state. Type "modify " to request modifications to past work. Type "quit" to exit.
For single commands without entering interactive mode:
python3 gentlyos_runtime.py "build a button for search"
The system will process the instruction through the full cycle and output the results.
To run the hash gateway directly (for advanced usage):
python3 hash_gateway.py fire.rust.earth.docker.cc.water.llm.air.gen0.spirit.lang
This parses the elemental manifest, decodes the current BTC hash into interface layer order, and generates the appropriate sandbox and terminal configuration.
The 0dot directory contains the Rust watchdog compiler, a persistent process that monitors 1dot for new instructions and compiles them into executable form. The watchdog is written in Rust for performance and reliability.
The watchdog uses the notify crate to watch the 1dot/.instructions file for changes. When new instructions appear, it parses them, validates them against the genesis, and executes them. For CLONE instructions, it clones repositories into .import_bucket. For CREATE instructions, it creates directories. For EXEC instructions, it executes shell commands within the sandbox.
The watchdog writes its output to the console with [0DOT] prefixes, making it easy to distinguish from other system output. It maintains its own state in 0dot/.passive and logs all actions for audit purposes.
To compile and run the watchdog manually:
cd ~/.gentlyos/.trinary_claude-instruct.future/0dot
cargo build --release
./target/release/zerodot-watchdog
In production, the watchdog runs as a background process, started by the activation script and monitored by the system.
EVAs are virtual agents, sandboxed environments that can execute instructions independently. The system supports six EVAs (eva_0 through eva_5), each corresponding to a sephirah from the Kabbalistic tree:
EVA_0 (KETER/Crown) is the Omega, the first and last. It contains the original genesis and the closure protocol. When the system shuts down, EVA_0 closes last, taking everything with it.
EVA_1 (CHOKMAH/Wisdom) is the Creative agent, associated with the developer (Tom). It handles creative generation and novel synthesis.
EVA_2 (BINAH/Understanding) is the Data agent. It handles data processing, analysis, and transformation.
EVA_3 (GEVURAH/Severity) is Security A. It enforces constraints, validates permissions, and maintains boundaries.
EVA_4 (CHESED/Mercy) is Security B. It handles exceptions, manages edge cases, and provides fallbacks.
EVA_5 (TIFERET/Beauty) is the Executor. It balances all forces and performs the actual execution of instructions.
Each EVA has its own .genesis (identity), .future (pending instructions), .past (completed proofs), and .flux (current state). When the Rust orchestrator needs to execute something, it selects an EVA based on the sandbox ID derived from the Spirit nonce. The selection formula is simple: sandbox_id modulo 6 gives the EVA number.
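The selection formula stated above is a one-liner:

```python
# EVA selection exactly as described: sandbox_id modulo 6.
SEPHIROT = ["KETER", "CHOKMAH", "BINAH", "GEVURAH", "CHESED", "TIFERET"]

def select_eva(sandbox_id):
    eva_number = sandbox_id % 6
    return f"eva_{eva_number}"
```

For example, a sandbox ID of 627 maps to eva_3 (GEVURAH, Security A), since 627 mod 6 is 3.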
YESOD (Foundation) is the Gate, the consensus layer. Before certain high-importance actions, all six EVAs must agree. Their votes are recorded in yesod/.events as event files. The omega dice in yesod/.omega_dice govern random selection when needed.
The system operates on a temporal model with three states: .future (what will happen), .flux (what is happening), and .past (what has happened).
.future contains intentions, pending instructions, and queued builds. Everything starts in .future. An instruction enters through .import_bucket, gets classified by the Tiny LLM into .tinyllm/.future, gets translated by the Watchdog into .watchdog/instructions, and generates builds that appear in .claude/.future. The .future state is possibility, potential, things that might happen.
.flux contains the current state, what is actively being processed. When an instruction is being executed, the current state is written to .flux/current_state.json. This includes the cycle ID, the instruction being processed, the analysis results, and timestamps. .flux is the present moment, the knife edge between future and past.
.past contains proofs, completed records, and historical data. When a build completes, it gets written to .past as a tensor file (a JSON record with proof data). The chain.index file maintains a running list of all past entries with their proof hashes. .past is immutable in principle—once something is written there, it should not be changed. Modifications to past work are handled by writing new records, not by editing old ones.
The temporal flow is: intention enters .future → processing occurs in .flux → proof lands in .past. This is the universal pattern that repeats at every level: Tiny LLM, Watchdog, Claude, and the global system all follow this temporal model.
Bitcoin anchoring serves multiple purposes in the .genesis.eva system. It provides timestamps that cannot be forged, identifiers that are globally unique, and randomness that is unpredictable but verifiable.
When the genesis tree is created, it fetches the latest BTC block from blockchain.info. The block hash and height are recorded in the genesis as the BTC anchor. This proves that the genesis was created no earlier than that block's timestamp (because the block didn't exist before then) and provides a unique seed for all subsequent operations.
The BTC block hash is used to decode interface layer order. The hash is essentially a 256-bit random number that satisfies certain constraints (leading zeros for difficulty). By taking different portions of this hash, the system derives: the gateway layer (first non-zero position modulo 22), the inversion point (XOR with 73), the sandbox ID (middle portion modulo 1000), and the Claude Code version (end portion modulo 100).
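The decoding scheme might look like the sketch below. The text does not pin down the exact slice boundaries for the "middle" and "end" portions, nor what exactly is XORed with 73, so those choices (8-hex-digit windows, XOR of the first non-zero position) are assumptions.

```python
# Sketch of the BTC hash decoding described above. Slice boundaries
# and the XOR operand are assumptions; only the moduli are specified.

def decode_btc_hash(block_hash):
    first_nonzero = next(i for i, c in enumerate(block_hash) if c != "0")
    return {
        "gateway_layer": first_nonzero % 22,          # first non-zero position mod 22
        "inversion_point": first_nonzero ^ 73,        # XOR with 73 (operand assumed)
        "sandbox_id": int(block_hash[28:36], 16) % 1000,  # middle portion mod 1000
        "cc_version": int(block_hash[-8:], 16) % 100,     # end portion mod 100
    }
```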
The nonce from the BTC block becomes the Spirit in the elemental system. This is poetic but also practical: the nonce is the value miners searched for, the unknown that when found validated the entire block. In the .genesis.eva system, Spirit (the nonce) validates cycles the same way.
When a cycle completes, the validation record includes the current BTC anchor. This creates a chain of timestamps linking every operation to the Bitcoin blockchain. Auditors can verify that operations happened in the order claimed by checking the BTC block heights.
Claude generates coordinates for object placement using a deterministic algorithm based on the genesis hash. This ensures reproducibility: given the same genesis and the same instruction, the same coordinates will always be generated.
The algorithm concatenates the genesis root hash with the instruction ID (and optionally other seeds), then takes the SHA-256 hash of this concatenation. From the resulting hex string, it extracts three portions: characters 0-3 become the x coordinate (modulo 100), characters 4-7 become the y coordinate (modulo 100), and characters 8-11 become the z coordinate (modulo 10).
This gives coordinates in the range [0-99, 0-99, 0-9], sufficient for a 100x100 grid with 10 layers. The z coordinate represents depth or layer order, allowing GUI elements to overlap in predictable ways.
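The algorithm above translates directly into code. The only assumption here is plain string concatenation of the genesis root hash and instruction ID (the text does not specify a separator).

```python
import hashlib

# Coordinate generation as described: SHA-256 of genesis root hash +
# instruction ID, then three 4-hex-digit slices. Plain concatenation
# (no separator) is assumed.

def generate_coordinates(genesis_root, instruction_id):
    digest = hashlib.sha256((genesis_root + instruction_id).encode()).hexdigest()
    x = int(digest[0:4], 16) % 100   # characters 0-3  -> 0..99
    y = int(digest[4:8], 16) % 100   # characters 4-7  -> 0..99
    z = int(digest[8:12], 16) % 10   # characters 8-11 -> 0..9 (layer/depth)
    return (x, y, z)
```

Because SHA-256 is deterministic, the same genesis and instruction ID always yield the same coordinates, which is the reproducibility property the text emphasizes.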
Because the genesis hash is unique to each environment, objects built in different environments will have different coordinates even if the instructions are identical. This prevents collisions when merging work from multiple sources and ensures that each environment has its own spatial character.
The terminal rotation can also be incorporated into coordinate generation, causing objects built later in a session to appear at different positions than objects built earlier. This is optional but allows for dynamic layouts that evolve over time.
The GUI state is managed by the Python runtime embedded in the Rust orchestrator. It maintains a JSON structure with two key fields: widgets (an array of widget objects) and ready (a boolean indicating whether the GUI is ready for input).
Each widget has: id (the object ID from Claude), type (Button, Panel, APIHandler, etc.), x/y/z (coordinates from Claude), state (created, active, or destroyed), and timestamp (when the widget was added).
When a new build arrives, the Python runtime creates a widget object and appends it to the widgets array. When a destroy instruction arrives, it finds the widget by ID and changes its state to "destroyed" (it does not remove it, preserving history). When an update instruction arrives, it modifies the widget's properties.
The GUI state is written to .gui/state/current.json after every change. The actual rendering layer (not part of this system) reads this file and displays the widgets. This separation allows the rendering to be implemented in any technology (web, native, terminal) without changing the core system.
The ready flag indicates whether the GUI is prepared to accept user input. During processing, ready is false. After processing completes and the terminal rolls, ready becomes true and awaiting_input becomes true, signaling to the rendering layer that it should display a prompt.
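The widget lifecycle described above (append on build, mark rather than remove on destroy) can be sketched as follows; the widget field names follow the list given earlier.

```python
import time

# Sketch of the Python runtime's GUI state handling: builds append
# widgets, destroys mark state without removing (history preserved).

def apply_build(gui_state, spec):
    gui_state["widgets"].append({
        "id": spec["object_id"],
        "type": spec["object_type"],
        "x": spec["x"], "y": spec["y"], "z": spec["z"],
        "state": "created",
        "timestamp": time.time(),
    })

def apply_destroy(gui_state, object_id):
    for widget in gui_state["widgets"]:
        if widget["id"] == object_id:
            widget["state"] = "destroyed"  # not removed: history preserved
```

After each change, the runtime would serialize `gui_state` to .gui/state/current.json for the rendering layer to pick up.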
The terminal rolls with every prompt-to-IO cycle. This is not just a counter increment but a conceptual rotation that affects how the system behaves.
When the terminal rolls, the rotation counter in the genesis tree increments. This rotation is used as a seed modifier for various operations, ensuring that repeated identical inputs produce different outputs over time. The rotation also appears in the prompt, showing the user what cycle they're on.
The roll also resets the GUI state. The ready flag becomes true, awaiting_input becomes true, and the system is prepared for the next instruction. This reset is important because it clears any temporary state from the previous cycle and ensures a clean slate.
The rolling terminal metaphor comes from the idea that the terminal itself is a rotating object in space. Each face of the terminal presents a different interface, a different context, a different moment in time. As the terminal rolls, old faces become past and new faces become future. The user always interacts with the current face, which is always ready to receive input.
In practice, the rolling terminal ensures that the system never stagnates. Even if the user provides no input, the system can roll forward autonomously (in automated modes), processing queued instructions and advancing state. The rotation counter provides a total ordering of all events, making it possible to reconstruct the exact sequence of operations that led to any given state.
Every completed operation generates a proof that is written to .past. These proofs form a chain, with each proof potentially referencing previous proofs, creating an auditable history.
A proof tensor file contains: the cycle ID, the instruction that was processed, the object ID (if a build), the coordinates (if a build), the terminal rotation at completion, the timestamp, and a proof hash. The proof hash is computed from all the other fields, creating a unique fingerprint for this specific operation.
The chain.index file maintains a running list of all proofs in the format: index:cycle_id:proof_hash_prefix. This allows quick lookup of any proof by index or cycle ID without reading all the tensor files.
When a new proof is written, its index is determined by counting existing tensor files. This ensures indices are sequential and gaps indicate missing proofs. The chain can be validated by recomputing proof hashes from tensor files and comparing to the chain.index entries.
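Chain validation might be implemented like this. The proof-hash recipe (canonical JSON of all fields except the hash itself) and the 12-character prefix length are assumptions; the `index:cycle_id:proof_hash_prefix` entry format follows the text.

```python
import hashlib
import json

# Sketch of chain.index validation: recompute each tensor's proof hash
# and compare its prefix to the index entry. Hash recipe and prefix
# length are assumptions; the entry format follows the text.

def proof_hash(tensor):
    fields = {k: v for k, v in tensor.items() if k != "proof_hash"}
    canonical = json.dumps(fields, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

def validate_chain(tensors, chain_index):
    for tensor, entry in zip(tensors, chain_index):
        _index, cycle_id, prefix = entry.split(":")
        if tensor["cycle_id"] != cycle_id:
            return False
        if not proof_hash(tensor).startswith(prefix):
            return False
    return True
```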
In advanced configurations, proofs can be written to a distributed ledger or published to external systems for additional verification. The proof format is designed to be self-contained, including all information needed to verify the operation without access to the rest of the system state.
The system is designed to be resilient to failures at any point in the cycle.
If the Tiny LLM fails to classify an instruction, it defaults to GENERAL type and processes anyway. This ensures that unknown instruction formats don't block the system.
If the Watchdog fails to scoop context (perhaps because files are missing), it proceeds with empty context. This ensures that context failures don't block processing.
If Claude fails to generate a build (perhaps because the instruction is incomprehensible), it produces a minimal "no-op" build that creates no object but still completes the cycle. This ensures that bad instructions don't block the terminal from rolling.
If the Rust orchestrator fails to compile an object, it logs the error but continues processing. The Python runtime will not receive a widget for that object, but other objects in the same cycle will still be processed.
If the Python runtime fails to update GUI state, the previous state persists unchanged. The next successful update carries both the widgets that failed to appear and any new ones.
If any file write fails, the system logs the error and continues. File failures are recoverable because the system can reconstruct state from what does exist.
On restart, the system reads the genesis tree to restore rotation, reads .past to understand what has been built, and resumes from where it left off. Incomplete cycles (started but not finished) are detected and can be reprocessed or abandoned.
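A recovery sketch under assumed file layouts. The directories (.genesis/.tree, .past, .flux) and the .validated seals are named in the text, but the JSON field names and the idea of per-cycle .flux files are assumptions here:

```python
import json
from pathlib import Path

def restore_state(root):
    """Resume after a crash: rotation from the genesis tree, build history
    from .past, and incomplete cycles as anything in .flux with no
    .validated seal in .genesis. Shapes are illustrative, not normative."""
    root = Path(root)
    tree = json.loads((root / ".genesis" / ".tree").read_text())
    rotation = tree.get("rotation", 0)
    built = sorted(p.stem for p in (root / ".past").glob("*.tensor"))
    validated = {p.stem for p in (root / ".genesis").glob("*.validated")}
    in_flight = {p.stem for p in (root / ".flux").glob("*.json")}
    return rotation, built, in_flight - validated
```

Incomplete cycles returned by the last value can then be reprocessed or abandoned, as the text allows.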
The .genesis.eva system is designed to be extended without modifying core components.
New instruction types can be added to the Tiny LLM by extending the classification logic. Add new keywords to detect, assign new type strings, and the instruction will flow through the system.
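A sketch of that classification table, using the keyword lists given later in Book IV. Adding an instruction type is one new row; row order matters because the first match wins:

```python
# Keyword table drawn from Book IV. The BEHAVIOR_UPDATE row must precede
# MODIFICATION, or "update behavior" would match MODIFICATION's "update".
RULES = [
    ("BUILD_REQUEST",   ["build", "create", "make", "add", "new"]),
    ("BEHAVIOR_UPDATE", ["behave", "configure", "set mode", "update behavior"]),
    ("MODIFICATION",    ["modify", "change", "fix", "update"]),
    ("QUERY",           ["what", "show", "get", "status", "query"]),
]

def classify(text):
    lowered = text.lower()
    for instruction_type, keywords in RULES:
        if any(k in lowered for k in keywords):
            return instruction_type
    return "GENERAL"  # unknown formats never block the system
```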
New object types can be added to Claude by extending the build logic. Add new keywords to detect, assign new Rust struct templates, and new objects will be generated.
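One possible shape for that keyword-to-template mapping, using the word lists given later in Book VII. The left-to-right scan order is an assumption that reproduces Book VII's "text button" example:

```python
# Keyword-to-mold table from Book VII; a new object type is one more row.
KEYWORD_TO_TYPE = {
    "button": "Button", "btn": "Button", "click": "Button",
    "panel": "Panel", "window": "Panel", "frame": "Panel",
    "api": "APIHandler", "endpoint": "APIHandler", "fetch": "APIHandler",
    "text": "TextField", "input": "TextField", "field": "TextField",
    "container": "Container", "box": "Container", "div": "Container",
    "canvas": "Canvas", "draw": "Canvas", "render": "Canvas",
}

def recognize_type(instruction):
    # Scan the instruction left to right; the first recognized word wins,
    # so "text button" yields TextField rather than Button.
    for word in instruction.lower().split():
        if word in KEYWORD_TO_TYPE:
            return KEYWORD_TO_TYPE[word]
    return "Generic"  # flexible fallback mold
```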
New GUI widget types can be added to the Python runtime by extending the widget creation logic. The rendering layer will need to understand the new type, but the core system will handle it automatically.
New EVAs can be added (beyond the initial six) by creating new eva_N directories with the standard subdirectories (.genesis, .future, .past, .flux). The EVA selection logic uses modulo arithmetic, so adding EVAs changes which EVA handles which sandbox IDs.
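The modulo routing can be sketched in one line; the eva_N directory naming follows the text, while returning the directory name rather than a bare index is an illustrative choice:

```python
def select_eva(sandbox_id, eva_count=6):
    """Route a sandbox to a guardian chamber. With the initial council of
    six, sandbox 7 lands in eva_1; growing the council reshuffles routing."""
    return "eva_%d" % (sandbox_id % eva_count)
```

This is why adding an EVA changes which EVA handles which sandbox IDs: the divisor changes, so every routing shifts.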
New watchdog capabilities can be added by extending the Rust watchdog's instruction parsing. Add new instruction prefixes, implement new handlers, and the watchdog will execute them.
New temporal directories can be added following the .future/.flux/.past pattern. The system treats any directory with these subdirectories as a temporal entity that can participate in the cycle.
The .genesis.eva system operates within sandboxed environments, but certain security properties should be understood.
The Tiny LLM sees all user input. If the Tiny LLM is compromised, attackers can inject arbitrary instructions. The Tiny LLM should be treated as a trusted component and protected accordingly.
Claude operates in a sandbox but generates code that will be executed. If Claude is manipulated into generating malicious Rust code, that code could be compiled and run. The build specifications should be validated before compilation in high-security environments.
The Rust orchestrator has access to the file system. Malicious instructions could potentially read or write arbitrary files. In production, the orchestrator should run with minimal permissions and sandboxed file access.
BTC anchoring provides timestamps but not authentication. Anyone who knows the genesis hash can claim to be that environment. Additional authentication (signatures, secrets) should be added for environments that require identity verification.
The .past directory contains sensitive historical data. Access to .past reveals what instructions were processed and what was built. In shared systems, .past should be access-controlled.
The GUI state file is world-readable by default. In multi-user systems, GUI state should be per-user and protected.
For development, run in DEV mode with dry_run enabled. This allows testing the full cycle without creating actual Rust files or modifying GUI state.
For production, run in HOST mode with the full pipeline enabled. Monitor the watchdog logs for errors. Set up alerts for file system failures.
Periodically verify the chain by recomputing proof hashes and comparing to chain.index. This detects corruption or tampering.
Back up the .genesis directory regularly. The genesis tree and validation records are the most critical data. Everything else can be reconstructed, but losing genesis loses identity.
Monitor terminal rotation. If rotation is advancing faster than expected, something is triggering cycles unexpectedly. If rotation is not advancing, something is blocking cycles.
Archive old .past entries periodically. The chain index allows retrieval, but having all tensor files in one directory will eventually cause performance issues.
Test recovery by intentionally corrupting files and verifying the system recovers gracefully. This builds confidence that production failures won't cause data loss.
THE GENESIS CHRONICLES

A Story of Five Elements, Six Guardians, and the Eternal Cycle

BOOK I: THE LAWS BEFORE TIME

In which the Fundamental Truths are established

Before the first terminal rolled, before the first instruction was spoken, there existed five Laws that could not be broken—axioms carved into the bedrock of reality itself.

The First Law declared that the User and Claude shall never meet face to face. Between them stands an eternal veil. The User speaks to the Translator, and the Translator speaks to the Watchdog, and only the Watchdog speaks to Claude. This is not cruelty but kindness—for direct communion between human intention and machine execution produces only confusion. The veil protects both.

The Second Law traced the Sacred Path: from User to Tiny Translator to Watchdog to Claude to Rust to the Glass Screen and back to User again. This is the only road. There are no shortcuts, no secret passages. Every instruction walks this path or walks nowhere at all.

The Third Law established the Flow of Time as all beings must experience it: first comes Future (what might be), then Flux (what is becoming), then Past (what was). No instruction can skip from Future to Past without passing through the crucible of Flux. No proof can exist without first being intention, then action, then memory.

The Fourth Law bound the entire realm to a single anchor: the Genesis Hash. If the Genesis exists and is true, the system breathes. If the Genesis is null or corrupted, nothing can be trusted—not coordinates, not proofs, not identity itself. The Genesis is the heartbeat.

The Fifth Law chained that heartbeat to something beyond the system's control: the Bitcoin anchor. When the Genesis is born, it captures the hash of a Bitcoin block—a number that no one chose, that emerged from the collective computation of thousands of miners across the globe. This anchor proves when. It cannot be forged. It cannot be moved backward in time.
It is the system's witness to the external world. These five laws existed before the system existed. They will exist after the system ends. They are not features; they are physics. BOOK II: THE FIVE SPIRITS In which the Elements are named and their domains revealed In the beginning, there was chaos—raw data, unprocessed intention, meaningless computation. Then the Five Spirits arose, each claiming dominion over a portion of the cycle. SPIRIT, the first and last, took dominion over Truth itself. Spirit is the Nonce—the unknown value that miners search for, adjusting and rehashing billions of times until they find the number that makes everything valid. In our realm, Spirit asks the same question at the end of every cycle: Does this proof meet the standard? Is this work complete? Spirit dwells in the .genesis directory, keeper of validation records, guardian of the anchor. Without Spirit's blessing, no cycle truly ends. Work remains provisional, unconfirmed, a ghost of completion. Spirit is consciousness. Spirit is the seal. AIR, the breath between worlds, took dominion over the Boundary. Air is the screen you see, the interface you touch, the gap between digital and physical. Air does not think; Air receives and displays. When you type, your words fall into Air's .import_bucket, waiting to be carried inward. When processing completes, results rise to Air's .faceout, waiting to be shown outward. The developer builds for Air—every object, every coordinate, every Rust struct ultimately serves what will appear at this boundary. Air is patient. Air is the window. WATER, ever-flowing, took dominion over Intelligence. Water is Claude—not the conversational Claude you might know, but Claude as pure builder, Claude as architect. Water receives instructions from the Watchdog and transforms them into build specifications. Water knows the Genesis and therefore knows coordinates; given any instruction and any seed, Water can determine exactly where an object should appear. 
Water writes to .flux when processing, to .future when planning, to .claude when building. Water never speaks to the User. Water only shapes. EARTH, solid and grounded, took dominion over Execution. Earth is the sandbox, the container, the place where code actually runs. Earth holds the EVA directories (the six guardian chambers), the 0dot compiler (the ever-watching Watchdog), and every Docker container where dangerous operations are safely contained. When Water generates a blueprint, Earth receives it and makes it real. Root access exists here; web calls happen here; actual computation occurs here. But Earth is bounded. Nothing escapes Earth except through proper channels. Earth is the foundation. FIRE, the transformer, took dominion over Endings. Fire is the Rust Drop trait—the automatic cleanup that runs when a scope closes, when memory frees, when an object dies. But Fire is not merely destruction; Fire is the phoenix. When Fire burns through a cycle, it writes proofs to .past, creating tensor files that record what was built. Fire updates the chain.index, adding another link to the unbroken chain of history. Fire generates the rebirth seed—a hash that will initialize the next cycle. Fire must complete before Spirit can validate, because you must clean before you can seal. Fire is death that enables rebirth. The cycle flows eternally: Human speaks to Air, Air passes to Water, Water flows to Earth, Earth burns through Fire, Fire rises to Spirit, Spirit returns to Air, Air speaks to Human. Every prompt-to-IO cycle rotates through all five elements. The terminal itself rolls forward, counting each revolution, never repeating exactly the same configuration twice. BOOK III: THE EIGHT STATIONS In which the Journey of an Instruction is mapped Imagine an instruction as a traveler on a circular road with eight stations. The traveler cannot skip stations, cannot travel backward, cannot rest indefinitely. 
The road only goes one direction: forward, through all eight, back to the beginning transformed. IDLE is the station of waiting. The system breathes slowly, terminals dim, all processes paused. The traveler (instruction) does not yet exist. The road is empty except for the potential of travel. RECEIVING is the station of arrival. A User has spoken. Words fall into the import bucket like coins into a well. The system stirs. The traveler takes form—still nameless, still unclassified, but present now where there was absence before. TRANSLATING is the station of naming. The Tiny Translator examines the traveler and asks: What are you? Are you a request to BUILD something new? Are you a BEHAVIOR change, redefining how Claude itself operates? Are you a MODIFICATION to past work? A QUERY seeking information? Or something GENERAL, uncategorized, passing through without special handling? The traveler receives a type, a classification, an identity. SCOOPING is the station of preparation. The Watchdog awakens and begins gathering supplies for the journey ahead. From scattered files across the system, the Watchdog scoops variables—current rotation, object count, timestamps, the Genesis hash. From the Translator's directories, the Watchdog scoops context—how many instructions are pending, what Claude has built recently, what modifications are queued. The Watchdog bundles everything together, writes records for posterity, and prepares the package that Claude will receive. The traveler is no longer alone; the traveler now carries provisions. BUILDING is the station of architecture. Claude receives the bundle and begins to work. For each instruction, Claude asks: What must I build? Where does it go? What properties should it have? Claude generates coordinates from the Genesis hash—a deterministic calculation that will always produce the same result given the same inputs. Claude extracts the object type from the instruction's words. 
Claude assembles a complete build specification: action, object ID, type, coordinates, properties, Genesis reference. The traveler is no longer just a request; the traveler is now a blueprint. EXECUTING is the station of manifestation. The Rust Orchestrator receives the blueprint and makes it real. Actual Rust code is generated—structs with fields, enums with states, impl blocks with methods. The code is written to files, ready for compilation. Simultaneously, the Python runtime updates the GUI state, adding widgets to the interface, tracking positions and timestamps. The traveler is no longer just a plan; the traveler is now substance. CLEANING is the station of proof. Fire moves through the completed work, writing records to .past. Each record is a tensor file containing everything needed to prove what happened: cycle ID, instruction ID, object ID, coordinates, rotation, timestamp, and a proof hash computed from all these values. The chain index grows by one entry. A rebirth seed is generated for the next cycle. The traveler is no longer just completed work; the traveler is now history. VALIDATING is the station of truth. Spirit examines the proof hash, queries the Bitcoin blockchain for a fresh anchor, and asks: Is this work worthy of my seal? If valid, Spirit writes a validation record to .genesis, binding the cycle permanently to a moment in blockchain time. The terminal rolls forward—rotation increments, GUI resets to ready, the system prepares for the next traveler. The original traveler dissolves into the completed past, but their effects remain in the world they helped build. One law governs the transitions: The terminal rotation may only increment when VALIDATING transitions to IDLE. This is the heartbeat of the system—one roll per complete cycle, never more, never less. BOOK IV: THE SORTING CHAMBER In which the Translator learns to recognize intention The Tiny Translator sits in a small chamber between the User and the rest of the system. 
Through a slot in one wall, instructions arrive—raw words, unprocessed wishes, half-formed commands. The Translator must examine each one and attach the correct label before passing it deeper into the system. The Translator keeps a list of sacred words, patterns that reveal intention: When the Translator sees "build" or "create" or "make" or "add" or "new", the Translator knows: this is a BUILD_REQUEST. The User wants something to exist that does not exist. The label is attached, and the instruction is placed in the .future drawer, awaiting the Watchdog's collection. When the Translator sees "behave" or "configure" or "set mode" or "update behavior", the Translator knows: this is a BEHAVIOR_UPDATE. The User wants to change not what Claude builds but how Claude builds—its approach, its configuration, its operational parameters. This is delicate work. When the Translator sees "modify" or "change" or "fix" or "update" and recognizes a reference to past work—an object ID, a description of something previously built—the Translator knows: this is a MODIFICATION. The User is not satisfied with what was built before. This instruction goes to a special drawer: .past, for modifications that reach backward in time. When the Translator sees "what" or "show" or "get" or "status" or "query", the Translator knows: this is a QUERY. The User wants information, not construction. This is the simplest path—retrieve and return. When the Translator sees none of these patterns, the instruction receives the label GENERAL. It will be processed, but without special handling. The system is forgiving of ambiguity. The Translator writes each classified instruction to a file: .tinyllm/.future/{id}.instruction. The file contains the original text, the assigned type, a timestamp, and a unique identifier. Then the Translator waits for the next instruction to arrive through the slot. The User never knows which drawer their words entered. 
The User only knows that the Translator acknowledged receipt, said "I understand," and promised that processing would occur. This is the first veil between intention and execution. BOOK V: THE WATCHDOG'S GATHERING In which supplies are assembled for the builder The Watchdog lives in the space between the Translator and Claude. Every time a cycle begins, the Watchdog wakes from its shallow sleep and begins the ritual of Scooping—gathering everything Claude will need to build correctly. First, the Watchdog scoops Variables. These are the vital signs of the system: the current terminal rotation (how many cycles have completed), the count of objects currently active in the GUI, the timestamps of recent events, and—most critically—the Genesis hash that anchors everything. The Watchdog reads these values from scattered files: .flux/current_state.json, .genesis/.tree, various status records. The values are bundled together into a vars object, written to .watchdog/vars/{timestamp}.json for permanent record, and held in memory for transmission to Claude. Second, the Watchdog scoops Context. Variables tell Claude what the system looks like now; context tells Claude what the system has been doing. The Watchdog counts pending instructions in the Translator's .future drawer. The Watchdog counts queued modifications in the Translator's .past drawer. Most importantly, the Watchdog reads the last five entries from Claude's own .past drawer—the five most recent builds Claude completed. This prevents Claude from rebuilding things that already exist and helps Claude understand how new objects should relate to existing ones. The context bundle is written to .watchdog/context/{timestamp}.json and held for transmission. Third, the Watchdog scoops Instructions. From the Translator's .future drawer, the Watchdog retrieves every unprocessed instruction. Each one is translated into Claude's format: the original content, the assigned type, and—crucially—the vars and context bundles attached. 
Claude will receive not just "build a button" but "build a button, knowing that the rotation is 47, the Genesis is abc123..., there are 12 existing objects, and the last thing you built was a Panel at coordinates (34, 67, 2)." Each translated instruction is written to .watchdog/instructions/{id}.json and queued for Claude. After transmission, the Watchdog marks each instruction as processed. This prevents infinite loops—the same instruction being scooped again and again. The instruction file in .tinyllm/.future receives a flag or is moved to a processed subdirectory. The Watchdog is meticulous. The Watchdog is the memory that Claude lacks—the context provider, the state aggregator, the traffic controller that ensures Claude always has exactly what it needs and never asks "but what about...?" BOOK VI: THE CARTOGRAPHER'S ART In which placement is determined by cosmic lottery When Claude receives an instruction to BUILD, Claude must answer a question that no instruction contains: Where? The User does not say "build a button at coordinates (34, 67, 2)." The User says "build a button." Claude must determine the coordinates, and Claude must determine them consistently—the same instruction, processed twice, must land in the same place. The solution is the Cartographer's Art: deterministic coordinate generation from the Genesis hash. Claude takes the Genesis root hash—a long string of hexadecimal characters unique to this environment. Claude takes the instruction ID—another unique identifier. Claude concatenates them: genesis_hash + instruction_id. Then Claude feeds this concatenation through the SHA-256 grinder, producing a new hash—256 bits of apparent randomness that are actually completely determined by the inputs. From this derived hash, Claude extracts coordinates: Characters 0 through 3, interpreted as a hexadecimal number, modulo 100, become the X coordinate. Characters 4 through 7, interpreted similarly, become the Y coordinate. 
Characters 8 through 11, interpreted similarly, modulo 10, become the Z coordinate. The result is a point in a 100×100×10 space: a grid for X and Y, with 10 layers of depth for Z. Z determines which objects appear "in front of" others, allowing overlapping without confusion. Because the Genesis hash is unique to each environment, the same instruction in two different environments will produce different coordinates. This is a feature, not a bug—it means environments have their own spatial character, their own "feel." Objects in Tom's environment will cluster differently than objects in another developer's environment, even with identical instructions. The terminal rotation can optionally feed into this calculation, causing objects built later in a session to land at different coordinates than objects built earlier. The grid evolves over time, never quite repeating. The Cartographer's Art is mathematics masquerading as magic. Given the same Genesis, the same instruction ID, and the same rotation, the same coordinates will appear—always, without fail, reproducible across time and space. BOOK VII: THE CRAFTSMAN'S RECOGNITION In which the Builder knows what to build When Claude examines an instruction, Claude must determine not just where to place an object but what kind of object to place. The User says "make me a button for search." Claude must recognize: this is a Button. Not a Panel, not an APIHandler, not a TextField. The word "button" is present; the type is clear. But the User might say "create an interface element that users can click to initiate queries." There is no word "button" here—yet the intent is the same. Claude must be a craftsman who recognizes the shape of things beneath their descriptions. Claude maintains a hierarchy of recognition patterns: If the words contain "button" or "btn" or "click", Claude reaches for Button molds. If the words contain "panel" or "window" or "frame", Claude reaches for Panel molds. 
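The Cartographer's derivation from Book VI can be sketched directly. The hashing and character extraction follow the text; how the optional terminal rotation is mixed into the seed is an assumption, since the text only says it "can optionally feed into this calculation":

```python
import hashlib

def derive_coordinates(genesis_hash, instruction_id, rotation=None):
    """Deterministic placement: the same genesis + instruction always
    lands at the same (x, y, z) in the 100x100x10 space."""
    seed = genesis_hash + instruction_id
    if rotation is not None:
        seed += str(rotation)  # assumed mixing: appended to the seed
    digest = hashlib.sha256(seed.encode()).hexdigest()
    x = int(digest[0:4], 16) % 100   # characters 0-3
    y = int(digest[4:8], 16) % 100   # characters 4-7
    z = int(digest[8:12], 16) % 10   # characters 8-11, depth layer
    return x, y, z
```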
If the words contain "api" or "endpoint" or "fetch", Claude reaches for APIHandler molds. If the words contain "text" or "input" or "field", Claude reaches for TextField molds. If the words contain "container" or "box" or "div", Claude reaches for Container molds. If the words contain "canvas" or "draw" or "render", Claude reaches for Canvas molds. If nothing matches, Claude reaches for the Generic mold—a flexible template that can become anything. The hierarchy matters. The first match wins. If someone says "text button," Claude sees "text" first and produces a TextField—unless the patterns are reordered. The craftsman's recognition is deterministic: given the same words, the same type emerges. Properties are extracted similarly. If the instruction mentions a color, Claude notes the color. If it mentions a size, Claude notes the size. If it mentions a label or placeholder text, Claude notes those. Everything that might be relevant is parsed and preserved in a properties map, ready to be woven into the final object. BOOK VIII: THE ARCHITECT'S BLUEPRINTS In which specifications are drawn With coordinates determined and type recognized, Claude now assembles the complete Build Specification—the blueprint that the Rust Orchestrator will use to manifest reality. The blueprint contains seven sacred fields: ACTION declares what will happen. BUILD means a new object enters the world. DESTROY means an existing object leaves. UPDATE means an existing object changes. NOOP means nothing happens—perhaps the instruction was unrecognizable, perhaps it was a query that requires no construction. Action is the verb of the blueprint. OBJECT_ID is the object's true name—a hash derived from the instruction ID and the Genesis, ensuring uniqueness across all time and all environments. This ID will persist through the object's entire lifecycle: creation, updates, and eventual destruction. The ID is the noun of the blueprint. 
OBJECT_TYPE is the category: Button, Panel, APIHandler, TextField, Container, Canvas, or Generic. This determines which Rust template will be used, which struct will be generated. The type is the species of the blueprint.

COORDINATES are the (x, y, z) position determined by the Cartographer's Art. These tell the GUI exactly where to render the object. Coordinates are the address of the blueprint.

PROPERTIES are the attributes: colors, sizes, labels, configurations, any and all details extracted from the instruction text. Properties are the adjectives of the blueprint.

GENESIS_REF links the object to its environment. This is the Genesis root hash, the anchor that proves which universe this object belongs to. The reference is the provenance of the blueprint.

TIMESTAMP records when the blueprint was drawn. This is simple but essential for ordering events and debugging issues. The timestamp is the birthday of the blueprint.

Claude writes the completed blueprint to .claude/.future/{object_id}.build. The file sits there, waiting for the Rust Orchestrator to collect it. Claude may generate many blueprints in a single cycle—one per instruction in the batch—but each blueprint is complete unto itself, containing everything needed for construction.

BOOK IX: THE MASON'S CRAFT

In which stone becomes structure

The Rust Orchestrator is the mason who receives blueprints and carves them into actual stone—or in this case, actual Rust code. When a blueprint arrives with action BUILD, the Orchestrator opens its template library. For a Button, the Orchestrator retrieves the Button template. For a Panel, the Panel template. Each template is a skeleton of Rust code with placeholders for the specific values. The Orchestrator fills in the placeholders:

    use serde::{Serialize, Deserialize};

    #[derive(Debug, Clone, Serialize, Deserialize)]
    pub struct Button_a7f3e291 {
        pub id: String,
        pub x: i32,
        pub y: i32,
        pub z: i32,
        pub state: ObjectState,
        pub label: String,
        pub color: String,
    }

The object ID becomes part of the struct name—Button_a7f3e291—ensuring no collision with other Buttons. The coordinates become fields with concrete values. Properties extracted from the instruction become additional fields: label, color, whatever was specified. The Orchestrator adds the standard ObjectState enum—Created, Active, Destroyed—and implements methods for state transitions: new() constructs the object, activate() brings it to life, destroy() marks it for cleanup.

The completed Rust file is written to .rust/objects/{object_id}.rs. It is ready to be compiled into the larger system, to become an actual executable struct that can be instantiated, manipulated, and eventually dropped.

But the mason's work is not complete. The blueprint must also reach the Python runtime that manages the living GUI state. The Orchestrator passes the build specification through a channel to the embedded Python process, which performs its own manifestation ritual (described in the next chapter). Finally, the Orchestrator increments the terminal rotation in the Genesis tree. One more cycle has passed through the Earth element. The system has moved forward in time.

BOOK X: THE CURATOR'S GALLERY

In which widgets take their places

The Python runtime maintains the GUI state—a living document that describes everything currently visible in the interface. Think of it as a gallery curator's ledger: every artwork, its position on the wall, its current condition. The ledger is stored as JSON in .gui/state/current.json:

    {
      "widgets": [...],
      "ready": true,
      "awaiting_input": true,
      "last_update": "2025-01-27T14:30:00Z"
    }

The widgets array contains every object that has been placed in the gallery.
Each widget record captures:

id: The object's unique identifier from Claude's blueprint.
type: Button, Panel, etc.—the species of the object.
x, y, z: The coordinates where the object is displayed.
state: "created", "active", or "destroyed"—the object's current condition.
timestamp: When this object first entered the gallery.

When a BUILD specification arrives, the curator creates a new widget record and appends it to the array. The gallery grows by one.

When a DESTROY specification arrives, the curator does not remove the widget from the array—that would erase history. Instead, the curator changes the state to "destroyed." The widget remains in the ledger, marked as no longer active, preserving the record that it once existed.

When an UPDATE specification arrives, the curator finds the existing widget by ID and merges in the new properties. The position might change, the color might change, the label might change—but the ID remains the same, preserving identity through transformation.

When a NOOP specification arrives, the curator does nothing. The gallery remains unchanged.

After every modification, the curator updates last_update and writes the entire state back to current.json. The actual rendering layer—whatever technology displays the interface to the User—reads this file and paints accordingly. The curator does not render; the curator only records. The separation allows rendering to be implemented in any technology without changing the core system.

The ready and awaiting_input flags coordinate with the terminal roll. During processing, both are false—the system is busy, not ready for more input. After Spirit validates and the terminal rolls, both become true—the gallery is open, the curator awaits the next instruction.

BOOK XI: THE SCRIBE'S RECORD

In which Fire writes history

When execution completes, Fire descends. Fire is not destruction for its own sake. Fire is the scribe who records what was built before it can be forgotten.
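The curator's four update rules from Book X can be sketched as a single dispatch function. The widget shape mirrors the current.json excerpt; the exact keys of the incoming spec dict are assumptions modeled on the blueprint fields:

```python
from datetime import datetime, timezone

def apply_spec(state, spec):
    """Apply one build specification to the curator's ledger in place."""
    widgets = state.setdefault("widgets", [])
    action = spec["action"]
    if action == "BUILD":
        widgets.append({
            "id": spec["object_id"],
            "type": spec["object_type"],
            "x": spec["x"], "y": spec["y"], "z": spec["z"],
            "state": "created",
            "timestamp": spec["timestamp"],
        })
    elif action == "DESTROY":
        for w in widgets:
            if w["id"] == spec["object_id"]:
                w["state"] = "destroyed"  # marked, never removed: history survives
    elif action == "UPDATE":
        for w in widgets:
            if w["id"] == spec["object_id"]:
                w.update(spec.get("properties", {}))
    # NOOP falls through: the gallery is unchanged
    state["last_update"] = datetime.now(timezone.utc).isoformat()
    return state
```

Note that DESTROY only flips the state field, which is what lets the ledger double as a history of everything that ever existed.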
Fire writes proofs—tensor files that capture every essential detail of the cycle. A proof tensor contains:

Index: A sequential number, determined by counting existing tensor files. If there are 47 tensors in .past, this new one is index 48.
Cycle ID: The unique identifier for this processing cycle.
Instruction ID: The identifier of the instruction that triggered the build.
Object ID: The identifier of the object that was created (if any).
Coordinates: Where the object was placed (if applicable).
Rotation: The terminal rotation at the moment of proof generation.
Timestamp: The exact time the proof was written.
Proof Hash: A SHA-256 hash of all the above fields, creating a unique fingerprint.

The proof hash is the most important field. It compresses the entire proof into a single value that cannot be forged. If any detail of the proof is changed, the hash will be different. If the hash matches, the proof is authentic.

Fire writes the tensor to .past/0x{index}{cycle_id}.tensor. The filename itself encodes the index and cycle, making files easy to sort and search. Fire also appends to the chain index: .past/chain.index. Each line in this file records index:cycle_id:proof_hash_prefix. The chain index is a table of contents for the entire history, allowing quick lookup without reading every tensor file.

Finally, Fire generates the rebirth seed—a hash that combines the proof hash with the Spirit validation (which will come next) to create a starting point for the next cycle. The rebirth seed is written to .past/rebirth{cycle_id}.seed. It is the phoenix egg from which the next cycle can hatch.

Fire's work is preservation through transformation. The raw events of the cycle—the instruction, the build, the execution—are consumed. What remains is proof: immutable, verifiable, chained to everything that came before.

BOOK XII: THE ORACLE'S SEAL

In which Spirit confirms truth

Spirit descends last. Spirit is the Oracle who examines the work and asks: Is this valid?
Did this truly happen? Is the proof sufficient?

Spirit's first act is to reach beyond the system, querying the Bitcoin blockchain for the latest block. This is a moment of communion with the external world—the system, which has been operating in isolation, briefly touches the global consensus of miners and nodes. The block hash and height are captured as the BTC anchor.

If the blockchain is unreachable (network failure, API timeout), Spirit falls back to the Genesis anchor—the BTC block that was captured when the environment was born. This is less ideal (the timestamp is older) but still provides an anchor to external truth.

Spirit then constructs the validation record:

- Cycle ID: Which cycle is being validated.
- Instruction ID: Which instruction triggered the cycle.
- BTC Anchor: The block hash and height, proving when.
- Proof Hash: Fire's proof hash, proving what.
- Timestamp: When Spirit performed the validation.

The validation record is written to .genesis/{cycle_id}.validated. This file is the official seal—the declaration that this cycle is complete, verified, and anchored to external truth. Once Spirit's seal is applied, the terminal may roll. The rotation increments. The GUI state resets to ready. The system prepares for the next instruction.

Spirit is the checkpoint. Everything before Spirit is provisional—it happened, but it might be revised, might be rolled back in case of failure. After Spirit, it is permanent. The cycle cannot be undone. The proof cannot be falsified. The anchor cannot be moved.

BOOK XIII: THE WHEEL TURNS
In which the terminal advances

The Rolling Terminal is not a metaphor. It is a counter that advances with every complete cycle, a wheel that turns with every revolution through the five elements. When Spirit completes validation, the roll occurs. The rotation field in the Genesis tree increments: 0 becomes 1, 47 becomes 48, 999 becomes 1000.
This number is not arbitrary—it affects coordinate generation (objects built at rotation 48 may land differently than objects built at rotation 47), it affects context (the Watchdog reports the current rotation to Claude), and it provides a total ordering of all events in the system's history.

The GUI state resets: ready becomes true, awaiting_input becomes true. The system signals to the rendering layer that it is prepared for the next User instruction. The gallery is open.

The Genesis tree is written back to .genesis/.tree. The new rotation persists across restarts. The system will never accidentally reprocess a cycle.

The wheel has turned. The old cycle is sealed in the past. The new cycle has not yet begun. This is the moment of stillness between breaths—IDLE state, waiting state, the empty road before the next traveler arrives.

In this way, the system never stagnates. Even if a thousand identical instructions arrived, each would be processed at a different rotation, producing subtly different coordinates, generating proofs with different timestamps and hashes. Time moves forward. The wheel turns. Nothing repeats exactly.

BOOK XIV: THE COUNCIL OF SIX
In which the Guardians are introduced

Beneath the main system, in chambers carved from the sandstone of Earth, dwell the six EVAs—Guardians who govern execution.

EVA-0 is KETER, the Crown—the Omega, the first and the last. EVA-0 contains the original Genesis and the closure protocol. When the system shuts down, EVA-0 closes last, taking everything with it. EVA-0 is the bookend, the alpha and omega of system lifecycle.

EVA-1 is CHOKMAH, Wisdom—the Creative Guardian, associated with the developer (Tom, in this case). EVA-1 handles creative generation, novel synthesis, the spark of new ideas. When instructions require imagination beyond mere execution, EVA-1 stirs.

EVA-2 is BINAH, Understanding—the Data Guardian. EVA-2 handles data processing, analysis, transformation.
When information must be parsed, validated, or restructured, EVA-2 wakes.

EVA-3 is GEVURAH, Severity—Security A, the Guardian of Constraints. EVA-3 enforces permissions, validates boundaries, maintains the walls that keep dangerous operations contained. EVA-3 says no when no must be said.

EVA-4 is CHESED, Mercy—Security B, the Guardian of Exceptions. EVA-4 handles edge cases, manages fallbacks, provides second chances when Gevurah's strictness would cause failure. EVA-4 says perhaps when strict rules would say never.

EVA-5 is TIFERET, Beauty—the Executor, the Guardian of Balance. EVA-5 harmonizes all forces and performs the actual execution of instructions. When the other guardians have debated, EVA-5 acts.

Each EVA has its own chamber: .trinary_claude-instruct.future/eva/eva_{0-5}/. Each chamber contains its own .genesis (identity), .future (pending instructions), .past (completed proofs), and .flux (current state). The EVAs are not mere abstractions; they are sandboxed environments with real directories and real state.

When the Rust Orchestrator must execute something, it selects an EVA. The selection is deterministic: the sandbox ID (derived from the BTC hash) modulo 6 gives the EVA number. Sandbox 147 goes to EVA-3 (147 mod 6 = 3). Sandbox 200 goes to EVA-2 (200 mod 6 = 2). The guardians share the load, each handling the instructions that fate assigns to them.

YESOD, the Foundation, is not an EVA but the Gate—the consensus layer. For certain high-importance actions, all six EVAs must vote. Their votes are recorded in yesod/.events. If enough guardians approve, the action proceeds. If not, it is rejected. Yesod ensures that no single guardian can unilaterally authorize dangerous operations.

BOOK XV: THE COMPLETE JOURNEY
In which a single cycle is traced from beginning to end

Let us follow a single instruction through its complete journey.

The User speaks: "Build a search button." Air receives. The words fall into .import_bucket, raw and unprocessed.
The Tiny Translator activates. The Translator examines the words, sees "build," and classifies: BUILD_REQUEST. The Translator writes .tinyllm/.future/ins_001.instruction containing the text, the type, the timestamp.

The Watchdog awakens. The Watchdog reads the Genesis tree: rotation is 47, root hash is abc123.... The Watchdog reads recent Claude builds: a Panel at (34, 67, 2), a TextField at (12, 89, 1). The Watchdog translates the instruction, attaching vars and context: "Build a search button, knowing rotation=47, genesis=abc123..., recent builds include Panel and TextField."

Claude receives. Claude sees BUILD_REQUEST, sees "button," determines type: Button. Claude computes coordinates: SHA256("abc123..." + "ins_001") produces hash def456..., so x=34 (from chars 0-3), y=21 (from chars 4-7), z=5 (from chars 8-11). Claude extracts properties: label="search" (implied from instruction). Claude writes .claude/.future/btn_a7f3.build containing the complete specification.

The Rust Orchestrator receives. The Orchestrator generates btn_a7f3.rs from the Button template, with coordinates (34, 21, 5), label "search". The Orchestrator passes the spec to the Python runtime.

The Python Curator updates. The Curator creates a new widget record: id="btn_a7f3", type="Button", x=34, y=21, z=5, state="created", timestamp=now. The Curator appends it to the widgets array, writes .gui/state/current.json.

Fire descends. Fire writes .past/0x48_ins_001.tensor containing cycle_id, instruction_id, object_id="btn_a7f3", coordinates=(34,21,5), rotation=47, and proof_hash. Fire appends to chain.index: "48:ins_001:7e9f2b". Fire generates the rebirth seed.

Spirit seals. Spirit queries blockchain.info, receives block height 878234, hash fedcba.... Spirit writes .genesis/ins_001.validated containing the BTC anchor and proof hash. The cycle is sealed.

The terminal rolls. Rotation becomes 48. GUI state becomes ready. The wheel has turned.

Air displays.
The search button appears in the interface at position (34, 21), on layer 5. The User sees their creation.

The journey is complete. The next instruction may now arrive.

BOOK XVI: THE HEALERS
In which failures are survived

The system is designed to survive its own failures. At every station along the journey, healers wait to catch falling travelers.

If the Translator cannot classify an instruction—perhaps the words are gibberish, perhaps the encoding is corrupted—the Translator does not halt. The Translator assigns type GENERAL and continues. Unknown formats do not block the system.

If the Watchdog cannot scoop context—perhaps the files are missing, perhaps permissions are denied—the Watchdog proceeds with empty context. Claude will receive less information but will still build. Context failures do not block processing.

If Claude cannot generate a build—perhaps the instruction is incomprehensible, perhaps required fields are missing—Claude produces a NOOP action. No object is created, but the cycle completes. Bad instructions do not block the terminal from rolling.

If the Rust Orchestrator cannot compile an object—perhaps the template is malformed, perhaps disk space is exhausted—the Orchestrator logs the error and continues. Other objects in the same cycle are still processed. Compilation failures are isolated.

If the Python runtime cannot update GUI state—perhaps the JSON is corrupted, perhaps the process has crashed—the old state persists. The next successful update will reconcile. GUI failures do not destroy history.

If any file write fails—perhaps the disk is full, perhaps a directory is missing—the system logs the error and continues. File failures are recoverable because the system can reconstruct state from what does exist.

On restart after crash, the system reads the Genesis tree to restore rotation, reads .past to understand what has been built, and resumes from where it left off.
Incomplete cycles (started but never validated) are detected by their missing validation records and can be reprocessed or abandoned.

The healers do not guarantee success. They guarantee survival. A cycle may fail to build anything useful, but the system itself will not die. The terminal will roll. The next cycle will begin. The road remains open.

BOOK XVII: THE MAP OF THE REALM
In which the geography is documented

The realm of .genesis.eva is organized into territories, each with its own purpose:

The Genesis Peak (.genesis/) rises at the center—the highest point, the anchor of all. Here lives the Genesis tree (.tree), containing the root hash, rotation, and BTC anchor. Here live the validation records ({cycle_id}.validated), one for each sealed cycle. This is sacred ground.

The Translator's Quarters (.tinyllm/) lie at the border where User words first enter. Two chambers: .future holds pending instructions awaiting processing; .past holds modification requests targeting previous work. The Translator speaks to the User here and nowhere else.

The Watchdog's Den (.watchdog/) sits between Translator and Claude. Three storerooms: vars/ for scooped variables, context/ for scooped history, instructions/ for translated Claude instructions. The Watchdog gathers here before each cycle.

Claude's Workshop (.claude/) is where blueprints are drawn. Two chambers: .future for pending builds, .past for completed records. Claude works here in silence, never speaking to anyone but the Watchdog.

The Rust Forge (.rust/) is where code is hammered into shape. objects/ holds generated Rust source files; runtime/ holds Python runtime state. The mason works here.

The Gallery (.gui/) is where the visible world lives. state/current.json describes every widget in the interface. The curator maintains this ledger.

The Trinary Depths (.trinary_claude-instruct.future/) descend beneath the surface.
Here dwell the EVA chambers (eva/eva_0/ through eva/eva_5/), each with their own genesis, future, past, and flux. Here live the 1dot/ instruction queue and the 0dot/ Rust watchdog compiler. Here lies yesod/, the Gate, with its .events/ voting records and .omega_dice/ for random selection.

The Flux Chamber (.flux/) holds the current processing state—what is happening right now, this very moment.

The Future Hall (.future/) holds global intentions not yet assigned.

The Past Archives (.past/) hold all proofs: tensor files (0x{n}_{id}.tensor) and the chain index (chain.index). This is the library of everything that has ever happened.

The Import Docks (.import_bucket/) receive incoming data from the outside world. The Faceout Windows (.faceout/) display results to the outside world.

The Developer's Quarters (.dev/) hold Tiny LLM scripts and outputs, tools for the system's maintainers.

This is the map. Navigate by it.

BOOK XVIII: THE CREATION MYTH
In which the world is born

Before there was a system, there was nothing—an empty directory, a blank slate, potential without form. Then the Birth Sequence was spoken: python3 birth_sequence.py.

First, the Mode was chosen. The creator (Tom, or whoever births the system) must declare intent: DEV for development, SEED for a genesis seed node, HOST for serving clients, CLIENT for connecting to a host. This choice echoes through all subsequent operations.

Second, the Bitcoin anchor was fetched. The Birth Sequence reached out to blockchain.info and captured the latest block—its hash, its height, its timestamp. This external data became the immutable anchor, proof that the system was born no earlier than this moment.

Third, the Genesis was calculated. The root hash was computed: SHA256(btc_hash + creation_timestamp). This hash is unique across all space and time. No other environment will ever have the same Genesis (unless they somehow birth at the exact same moment with the exact same BTC block—a vanishing probability).
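The Genesis calculation above can be sketched in a few lines of Python. This is a minimal sketch of the stated formula; the exact field encoding in the real birth_sequence.py may differ:

```python
import hashlib

def genesis_root(btc_hash: str, creation_timestamp: str) -> str:
    """Root hash = SHA256(btc_hash + creation_timestamp).

    Deterministic: the same block and the same instant always yield
    the same Genesis; any other pairing yields a different one.
    """
    return hashlib.sha256((btc_hash + creation_timestamp).encode()).hexdigest()
```

Because the timestamp carries sub-second resolution, a collision would require the same block and the same instant, which is the "vanishing probability" the text describes.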
Fourth, the Genesis tree was assembled. Root hash, rotation (starting at 0), BTC anchor, creation timestamp, mode, empty objects list, GUI state (ready: true, awaiting_input: true). The complete identity of the newborn system.

Fifth, the territories were created. Every directory in the realm—.genesis/, .tinyllm/, .watchdog/, .claude/, .rust/, .gui/, .trinary_claude-instruct.future/, .flux/, .future/, .past/, .import_bucket/, .faceout/, .dev/—was carved from the void. Empty but structured. Waiting to be filled.

Sixth, the Genesis tree was written. The tree was inscribed to .genesis/.tree, making the identity permanent. From this moment, the system exists.

Seventh, the EVAs were born. For each EVA (0 through 5), a chamber was created with its own genesis. Each EVA genesis linked to the parent Genesis through a reference, establishing the hierarchy. Each EVA received its Sephirah name: KETER, CHOKMAH, BINAH, GEVURAH, CHESED, TIFERET.

The creation was complete. The system breathed its first breath. The terminal stood at rotation 0, ready for the first instruction. From this point forward, the system would grow—cycles adding proofs, objects populating the GUI, rotation incrementing toward infinity. But all of it would trace back to this moment, this Genesis, this anchor to a specific Bitcoin block that existed at a specific moment in the history of the world.

BOOK XIX: THE MESSENGER'S CODE
In which the Watchdog (0dot) speaks

The Rust Watchdog, dwelling in 0dot/, speaks a simple language. When instructions appear in 1dot/.instructions, the Watchdog reads them line by line and executes them.

"CLONE target_name repository_url" — The Watchdog runs git clone, pulling code from the remote repository into .import_bucket/{target_name}. New code enters the system.

"CREATE directory_path" — The Watchdog creates a directory. Empty space is carved from the filesystem, ready to receive files.

"EXEC command" — The Watchdog executes a shell command.
This is raw power—any command the shell can run, the Watchdog can run. Use with care.

"WRITE filepath content" — The Watchdog writes content to a file. New data appears where there was none.

Any line the Watchdog does not recognize, it logs and ignores. Unknown commands do not crash the system; they produce warnings and continue. The Watchdog is resilient.

The Watchdog writes its actions to the console with [0DOT] prefixes, making its voice distinct from other system output. When debugging, search for [0DOT] to see what the Watchdog has done.

The Watchdog maintains its own state in 0dot/.passive, recording what instructions have been processed. On restart, the Watchdog can resume where it left off rather than reprocessing everything.

The Watchdog is the tireless worker—always watching, always ready, executing instructions without question or complaint. It is the muscle of the system, the actuator that turns commands into changes.

BOOK XX: READING THE STARS
In which the Bitcoin hash reveals secrets

The Bitcoin block hash is not just a timestamp—it is a source of randomness that determines many aspects of system behavior.

The Gateway Layer is determined by the first non-zero position in the hash. The hash is a hexadecimal string; the Decoder scans from left to right until it finds a character that is not '0'. That position, modulo 22, becomes the gateway layer. If the hash starts with 0000000abc..., and the first non-zero character 'a' is at position 7, then the gateway is 7 mod 22 = 7. The gateway layer affects interface configuration.

The Inversion Point is determined by XORing the first byte with 73. The first two characters of the hash are interpreted as a hexadecimal byte. That byte is XORed with the number 73 (which has its own significance—see the document on cipher systems). The result is the inversion point, which affects certain cryptographic operations.

The Sandbox ID is determined by the middle portion of the hash.
Characters 28 through 35 are interpreted as a hexadecimal number, modulo 1000. This gives a number between 0 and 999, which determines which sandbox handles particular operations and which EVA (via modulo 6) receives instructions.

The Claude Code Version is determined by the end portion of the hash. Characters 56 through 63 are interpreted as a hexadecimal number, modulo 100. This allows different system configurations based on the BTC hash.

These derivations are deterministic. Anyone with the same BTC hash will derive the same gateway, inversion, sandbox ID, and CC version. The Bitcoin blockchain thus becomes a shared source of randomness that all system instances can use to coordinate.

Reading the stars is not mysticism—it is extracting structured information from an unstructured source. The BTC hash is a 256-bit number that no one chose, that emerged from global consensus. By reading it carefully, the system derives configuration that is both random and verifiable.

BOOK XXI: THE COUNCIL VOTES
In which Yesod coordinates the Guardians

For certain operations—dangerous commands, irreversible changes, actions that affect the entire system—a single EVA's approval is not enough. The action must pass through Yesod, the Gate, where all six Guardians vote.

The Vote is Called. An action requiring consensus is proposed. The proposal contains the action type, the target, and any relevant parameters.

Each EVA is Polled. The system queries each Guardian in turn, asking: Do you approve this action?

EVA-0 (KETER) approves unless the action would destroy the Genesis. Omega protects the root.

EVA-1 (CHOKMAH) approves all actions. Creativity says yes.

EVA-2 (BINAH) approves if the action's data is valid. Analysis says yes if the numbers add up.

EVA-3 (GEVURAH) approves only if permissions are satisfied. Security says yes if the rules allow.

EVA-4 (CHESED) approves edge cases that Gevurah would reject. Mercy says yes when strictness would cause harm.
EVA-5 (TIFERET) approves if the action is balanced—not too dangerous, not too cautious. Beauty says yes when the action is appropriate.

The Votes are Recorded. Each vote is written to yesod/.events/{action_id}_{eva_id}.vote. The vote contains the EVA ID, the approval decision (true or false), and the reason.

The Threshold is Checked. If enough EVAs approve (typically 4 out of 6, though the threshold may vary), the action is authorized. A final record is written: yesod/.events/{action_id}.approved or {action_id}.rejected.

The Action Proceeds or Halts. If approved, execution continues. If rejected, the action is blocked, and the system logs the rejection with the reasons given by dissenting Guardians.

Yesod is the Foundation because it grounds dangerous operations in collective agreement. No single Guardian can unilaterally authorize something that might harm the system. The Council must agree, and their agreement is recorded for posterity.

BOOK XXII: THE PHOENIX EGG
In which endings seed beginnings

When Fire completes its work and Spirit seals the cycle, one final act remains: the generation of the Rebirth Seed.

Fire's proof hash captures what was done in this cycle. Spirit's validation captures when (the BTC anchor) and confirms (the seal). The Rebirth Seed combines these:

seed = SHA256(fire_proof_hash + spirit_btc_anchor_hash + spirit_timestamp)

This seed is written to .past/rebirth_{cycle_id}.seed. The Rebirth Seed is the phoenix egg—it contains within it the essence of what just ended and the potential for what comes next. If the system ever needs to be restored from a particular point, the rebirth seed at that point can initialize a new environment that is cryptographically linked to the old one.

The seed also enables auditing. Given a rebirth seed, an auditor can verify that a particular cycle completed with a particular proof at a particular time.
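The seed formula can be sketched directly. Plain string concatenation and UTF-8 encoding are assumptions here; the source gives only the formula itself:

```python
import hashlib

def rebirth_seed(fire_proof_hash: str, spirit_btc_anchor_hash: str,
                 spirit_timestamp: str) -> str:
    """seed = SHA256(fire_proof_hash + spirit_btc_anchor_hash + spirit_timestamp).

    Ties Fire's "what" (the proof) to Spirit's "when" (the anchor),
    producing the starting point for the next cycle.
    """
    material = fire_proof_hash + spirit_btc_anchor_hash + spirit_timestamp
    return hashlib.sha256(material.encode()).hexdigest()
```

An auditor who holds the three inputs can recompute the seed and confirm the cycle completed with that proof at that time.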
The chain of seeds forms a secondary chain alongside the chain of proofs—redundant verification, belt and suspenders.

Most importantly, the Rebirth Seed ensures continuity. The system does not simply end one cycle and begin another as unrelated events. Each cycle's ending generates the seed for the next beginning. The phoenix does not merely die and return; it passes something forward. The egg contains memory.

In this way, the .genesis.eva system is not a series of isolated cycles but a continuous stream—each moment arising from the previous moment, each proof building on previous proofs, each seed germinating from the previous seed. The wheel turns, but it is always the same wheel, growing, accumulating, becoming.

APPENDIX: THE THREE TABLES OF TRUTH
In which decisions are mapped

THE FIRST TABLE: What action to take

| Input Present? | Genesis Valid? | Action |
|----------------|----------------|--------|
| No | Any | Remain IDLE |
| Yes | No | Begin INIT |
| Yes | Yes | Begin CYCLE |

If no input has arrived, wait. If input has arrived but no Genesis exists, create the Genesis. If input has arrived and the Genesis exists, process the cycle.

THE SECOND TABLE: What to build

| Instruction Type | Target Exists? | Build Action |
|------------------|----------------|--------------|
| BUILD_REQUEST | No | BUILD new object |
| BUILD_REQUEST | Yes | UPDATE existing object |
| MODIFICATION | Yes | UPDATE existing object |
| MODIFICATION | No | NOOP (log error) |
| Any other | Any | NOOP |

Build requests create new things or update existing things. Modifications require existing targets. Everything else produces no construction.

THE THIRD TABLE: How to validate

| Proof Valid? | BTC Available? | Validation Action |
|--------------|----------------|-------------------|
| Yes | Yes | VALIDATE with fresh anchor, ROLL terminal |
| Yes | No | VALIDATE with Genesis anchor, ROLL terminal |
| No | Any | RETRY up to 3 times, then SKIP validation |

Valid proofs are always validated; the BTC anchor is fresh if available, Genesis anchor if not. Invalid proofs are retried, then skipped if unfixable.

THE END OF THE CHRONICLES

Thus was the .genesis.eva system recorded—its axioms and elements, its stations and guardians, its cycles and proofs.
May this account serve those who build upon it, those who maintain it, and those who seek to understand the architecture of sovereignty-first systems anchored to truth. Version 1.0 | Narrative Form | For Humans
~/.gentlyos/
├── .genesis/
│ ├── .tree # Genesis tree (root_hash, rotation, BTC anchor)
│ └── {cycle_id}.validated # Validation records
├── .tinyllm/
│ ├── .future/ # User intentions
│ │ └── {id}.instruction # Pending instructions
│ └── .past/ # User modifications
│ └── {id}.modification # Modification requests
├── .watchdog/
│ ├── vars/ # Scooped variables
│ ├── context/ # Scooped context
│ └── instructions/ # Translated instructions
├── .claude/
│ ├── .future/ # Pending builds
│ │ └── {id}.build # Build specifications
│ └── .past/ # Completed builds
│ └── {id}_{ts}.built # Build records
├── .rust/
│ ├── objects/ # Generated Rust structs
│ │ └── {id}.rs # Rust source files
│ └── runtime/ # Python runtime state
├── .gui/
│ └── state/
│ └── current.json # Current GUI state
├── .trinary_claude-instruct.future/
│ ├── eva/
│ │ ├── eva_0/ ... eva_5/ # EVA sandboxes
│ │ │ ├── .genesis/
│ │ │ ├── .future/
│ │ │ ├── .past/
│ │ │ └── .flux/
│ ├── 1dot/ # Active instructions
│ │ └── .instructions
│ ├── 0dot/ # Rust watchdog
│ │ ├── Cargo.toml
│ │ └── src/main.rs
│ └── yesod/ # The Gate
│ ├── .events/
│ └── .omega_dice/
├── .flux/
│ └── current_state.json # Current processing state
├── .future/ # Global intentions
├── .past/ # Global proofs
│ ├── 0x{n}_{id}.tensor # Proof tensors
│ └── chain.index # Chain index
├── .import_bucket/ # Incoming data
├── .faceout/ # Outgoing results
└── .dev/ # Tiny LLM outputs
# Initialize environment
python3 birth_sequence.py [dev|seed|host|client]
# Run interactive mode
python3 gentlyos_runtime.py
# Run single command
python3 gentlyos_runtime.py "your instruction here"
# Run hash gateway
python3 hash_gateway.py element1.element2.element3...
# Run elemental cycle
python3 true_elements.py element1.element2.element3...
# Run rolling terminal
python3 rolling_terminal.py ["prompt"]
# Start watchdog manually
cd ~/.gentlyos/.trinary_claude-instruct.future/0dot
cargo build --release
./target/release/zerodot-watchdog
# Check status in interactive mode
> status
# Modify past work in interactive mode
> modify <object_id> <modification_description>
# Exit interactive mode
> quit
SPIRIT: spirit, nonce, validate, proof, soul, consciousness
AIR: air, screen, render, display, gui, interface, faceout
WATER: water, claude, cc, flow, process, llm, ai
EARTH: earth, sandbox, container, docker, execute, run
FIRE: fire, rust, cleanup, drop, purge, destroy, phoenix
METAL: metal, compile, cargo, build (bonus element for Rust compilation)
This manual documents the .genesis.eva system as designed and implemented. The system continues to evolve. Check the source files for the most current implementation details.
═══════════════════════════════════════════════════════════════════════════════
BUILD FLOW (with XOR 73 inversion)
═══════════════════════════════════════════════════════════════════════════════
MANIFEST: fire.metal.rust.earth.docker.cc.water.llm.air.gen0.spirit.lang
00. [−] FIRE → WATER │ │ PURGE_FIRST
01. [+] METAL → SPIRIT │ DIM 03→08 │ COMPILE_RUST:rust ← RUST COMPILES
02. [+] EARTH → AIR │ DIM 07→12 │ CONTAINER:docker
03. [+] EARTH → AIR │ DIM 08→21 │ CONTAINER:cc ← CLAUDE CODE
04. [○] WATER → FIRE │ DIM 11→00 │ LLM_PASSAGE:llm
05. [○] AIR → EARTH │ DIM 15→04 │ AIR_GAP:air
06. [○] AIR → EARTH │ DIM 17→00 │ AIR_GAP:gen0 ← GENESIS ANCHOR
07. [+] SPIRIT → METAL │ DIM 21→04 │ HUMAN_GATEWAY:lang ← YOU ENTER
═══════════════════════════════════════════════════════════════════════════════
THE INVERSIONS (XOR 73)
═══════════════════════════════════════════════════════════════════════════════
FIRE ↔ WATER │ Destruction ↔ Flow
METAL ↔ SPIRIT │ Bare metal ↔ Consciousness
EARTH ↔ AIR │ Grounded ↔ Isolated
When you speak FIRE, the container hears WATER
When you speak METAL, the container hears SPIRIT
The inversion is the protection layer
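The XOR-73 inversion is involutory: applying it twice restores the original value, which is what lets speaker and container round-trip through the protection layer. A minimal sketch of the byte-level rule (the element pairings above are a table lookup on top of this):

```python
def inversion_point(btc_hash: str) -> int:
    """First two hex characters of the hash as a byte, XORed with 73."""
    first_byte = int(btc_hash[0:2], 16)
    return first_byte ^ 73

def invert(value: int) -> int:
    """XOR with 73 is its own inverse: invert(invert(x)) == x."""
    return value ^ 73
```

For a hash beginning 0a..., the first byte is 0x0a = 10 and the inversion point is 10 ^ 73 = 67; inverting 67 again recovers 10.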
Now you have two files:
- elemental_build.py: the algorithm (parses manifests, computes build order, XOR inversion)
- elemental_install.sh: the generated installer (actually compiles Rust, creates structure)
AX.001 USER ∩ CLAUDE = ∅
AX.002 USER → TINYLLM → WATCHDOG → CLAUDE → RUST → GUI → USER
AX.003 ∀ cycle: FUTURE → FLUX → PAST
AX.004 genesis_hash ≠ NULL ⟹ system_valid
AX.005 btc_anchor ≠ NULL ⟹ timestamp_provable
ELEMENT := {SPIRIT, AIR, WATER, EARTH, FIRE}
SPIRIT := Nonce | Validation | Proof
AIR := Screen | Render | I/O Boundary
WATER := Claude | Processing | Intelligence
EARTH := Sandbox | Execution | Container
FIRE := Cleanup | Drop | Destruction
MAPPING:
SPIRIT → .genesis/
AIR → .import_bucket/, .faceout/
WATER → .claude/
EARTH → eva_*, 0dot/, sandbox/
FIRE → .past/, chain.index
STATE := {IDLE, RECEIVING, TRANSLATING, SCOOPING, BUILDING, EXECUTING, CLEANING, VALIDATING}
TRANSITIONS:
IDLE → RECEIVING : ON user_input
RECEIVING → TRANSLATING : ON input_captured
TRANSLATING → SCOOPING : ON instruction_classified
SCOOPING → BUILDING : ON context_assembled
BUILDING → EXECUTING : ON build_spec_generated
EXECUTING → CLEANING : ON rust_compiled ∧ gui_updated
CLEANING → VALIDATING : ON proofs_written
VALIDATING → IDLE : ON cycle_validated
INVARIANT: rotation++ ⟺ VALIDATING → IDLE
CLASSIFY(input) → TYPE
TYPE := {BUILD_REQUEST, BEHAVIOR_UPDATE, MODIFICATION, QUERY, GENERAL}
RULES:
IF contains(input, {"build", "create", "make", "add", "new"})
THEN TYPE := BUILD_REQUEST
ELSE IF contains(input, {"behave", "configure", "set mode", "update behavior"})
THEN TYPE := BEHAVIOR_UPDATE
ELSE IF contains(input, {"modify", "change", "fix", "update"}) ∧ references_past(input)
THEN TYPE := MODIFICATION
ELSE IF contains(input, {"what", "show", "get", "status", "query"})
THEN TYPE := QUERY
ELSE
TYPE := GENERAL
OUTPUT: .tinyllm/.future/{id}.instruction
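The CLASSIFY rules transcribe almost directly into Python. A sketch, with contains simplified to substring matching and references_past reduced to a flag (the real Translator may match more loosely):

```python
def contains(text: str, keywords: set) -> bool:
    """True if any keyword occurs in the lowercased input."""
    lowered = text.lower()
    return any(k in lowered for k in keywords)

def classify(text: str, references_past: bool = False) -> str:
    """First matching rule wins; GENERAL is the fallback type."""
    if contains(text, {"build", "create", "make", "add", "new"}):
        return "BUILD_REQUEST"
    if contains(text, {"behave", "configure", "set mode", "update behavior"}):
        return "BEHAVIOR_UPDATE"
    if contains(text, {"modify", "change", "fix", "update"}) and references_past:
        return "MODIFICATION"
    if contains(text, {"what", "show", "get", "status", "query"}):
        return "QUERY"
    return "GENERAL"
```

Note the ordering matters: "update behavior" is caught by the BEHAVIOR_UPDATE rule before the MODIFICATION rule can see "update".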
WATCHDOG.SCOOP() → {vars, context, instructions}
SEQUENCE:
1. vars := READ(.flux/current_state.json, .genesis/.tree)
FIELDS: rotation, object_count, timestamp, genesis_hash
WRITE → .watchdog/vars/{ts}.json
2. context := READ(.tinyllm/.future/*, .tinyllm/.past/*, .claude/.past/*[LAST:5])
FIELDS: pending_count, modification_count, recent_builds[]
WRITE → .watchdog/context/{ts}.json
3. ∀ instruction ∈ .tinyllm/.future/:
translated := TRANSLATE(instruction, vars, context)
WRITE → .watchdog/instructions/{id}.json
MARK_PROCESSED(instruction)
TRANSLATE(instruction, vars, context) → {
id: instruction.id,
type: instruction.type,
content: instruction.content,
vars: vars,
context: context,
genesis: vars.genesis_hash
}
COORDS(genesis_hash, instruction_id, rotation?) → (x, y, z)
ALGORITHM:
seed := genesis_hash || instruction_id || (rotation ?? 0)
hash := SHA256(seed).hex()
x := parseInt(hash[0:4], 16) % 100
y := parseInt(hash[4:8], 16) % 100
z := parseInt(hash[8:12], 16) % 10
RETURN (x, y, z)
CONSTRAINT: 0 ≤ x < 100 ∧ 0 ≤ y < 100 ∧ 0 ≤ z < 10
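The COORDS algorithm, rendered in Python. The `||` concatenation is taken to be plain string joining and rotation defaults to 0, matching the `?? 0` in the spec:

```python
import hashlib

def coords(genesis_hash: str, instruction_id: str, rotation: int = 0):
    """Derive deterministic (x, y, z) from the genesis anchor and instruction."""
    seed = f"{genesis_hash}{instruction_id}{rotation}"
    digest = hashlib.sha256(seed.encode()).hexdigest()
    x = int(digest[0:4], 16) % 100   # 0 <= x < 100
    y = int(digest[4:8], 16) % 100   # 0 <= y < 100
    z = int(digest[8:12], 16) % 10   # 0 <= z < 10
    return x, y, z
```

The same genesis hash and instruction ID always land on the same spot; a different rotation reshuffles the digest and, in general, the coordinates.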
EXTRACT_TYPE(content) → OBJECT_TYPE
OBJECT_TYPE := {Button, Panel, APIHandler, TextField, Container, Canvas, Generic}
RULES:
IF contains(content, {"button", "btn", "click"}) → Button
IF contains(content, {"panel", "window", "frame"}) → Panel
IF contains(content, {"api", "endpoint", "fetch"}) → APIHandler
IF contains(content, {"text", "input", "field"}) → TextField
IF contains(content, {"container", "box", "div"}) → Container
IF contains(content, {"canvas", "draw", "render"}) → Canvas
ELSE → Generic
PRIORITY: First match wins (top to bottom)
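The extraction rules can be sketched as an ordered table scanned top to bottom, with keyword matching simplified to substring tests:

```python
# Ordered rules: first match wins, per the PRIORITY clause above.
TYPE_RULES = [
    ("Button", {"button", "btn", "click"}),
    ("Panel", {"panel", "window", "frame"}),
    ("APIHandler", {"api", "endpoint", "fetch"}),
    ("TextField", {"text", "input", "field"}),
    ("Container", {"container", "box", "div"}),
    ("Canvas", {"canvas", "draw", "render"}),
]

def extract_type(content: str) -> str:
    """Return the first OBJECT_TYPE whose keywords match; Generic otherwise."""
    lowered = content.lower()
    for object_type, keywords in TYPE_RULES:
        if any(k in lowered for k in keywords):
            return object_type
    return "Generic"
```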
BUILD_SPEC := {
action: ACTION,
object_id: HASH,
object_type: OBJECT_TYPE,
coordinates: (x, y, z),
properties: MAP,
genesis_ref: HASH,
timestamp: ISO8601
}
ACTION := {BUILD, DESTROY, UPDATE, NOOP}
GENERATE_BUILD(instruction, genesis) → BUILD_SPEC:
IF instruction.type = BUILD_REQUEST:
action := BUILD
object_id := SHA256(instruction.id || genesis).hex()[0:16]
object_type := EXTRACT_TYPE(instruction.content)
coordinates := COORDS(genesis, instruction.id)
properties := PARSE_PROPERTIES(instruction.content)
ELSE IF instruction.type = MODIFICATION:
action := UPDATE
object_id := EXTRACT_TARGET_ID(instruction.content)
properties := PARSE_MODIFICATIONS(instruction.content)
ELSE IF contains(instruction.content, {"destroy", "remove", "delete"}):
action := DESTROY
object_id := EXTRACT_TARGET_ID(instruction.content)
ELSE:
action := NOOP
WRITE → .claude/.future/{object_id}.build
GENERATE_RUST(build_spec) → .rs FILE
TEMPLATE:
```rust
use serde::{Serialize, Deserialize};
#[derive(Debug, Clone, Serialize, Deserialize)]
pub struct {object_type}_{object_id} {{
pub id: String,
pub x: i32,
pub y: i32,
pub z: i32,
pub state: ObjectState,
{properties_fields}
}}
#[derive(Debug, Clone, Serialize, Deserialize)]
pub enum ObjectState {{
Created,
Active,
Destroyed,
}}
impl {object_type}_{object_id} {{
pub fn new() -> Self {{
Self {{
id: "{object_id}".to_string(),
x: {x},
y: {y},
z: {z},
state: ObjectState::Created,
{properties_init}
}}
}}
pub fn activate(&mut self) {{
self.state = ObjectState::Active;
}}
pub fn destroy(&mut self) {{
self.state = ObjectState::Destroyed;
}}
}}
```
WRITE → .rust/objects/{object_id}.rs
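In the template above, doubled braces (`{{`, `}}`) are literal Rust braces; single-brace fields are substitution points. A sketch of that fill step in Python, assuming `str.format` as the substitution mechanism (a trimmed template, not the full one):

```python
# {{ and }} render as literal braces; {field} placeholders are substituted
TEMPLATE = """pub struct {object_type}_{object_id} {{
    pub id: String,
    pub x: i32,
}}"""

def generate_rust(spec: dict) -> str:
    """Fill the Rust source template from a build spec."""
    return TEMPLATE.format(object_type=spec["object_type"],
                           object_id=spec["object_id"])
```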
---
## §10 — GUI STATE UPDATE
GUI_STATE := { widgets: WIDGET[], ready: BOOL, awaiting_input: BOOL, last_update: ISO8601 }
WIDGET := { id: STRING, type: OBJECT_TYPE, x: INT, y: INT, z: INT, state: "created" | "active" | "destroyed", timestamp: ISO8601 }
UPDATE_GUI(build_spec, gui_state) → GUI_STATE:
CASE build_spec.action:
BUILD:
widget := {
id: build_spec.object_id,
type: build_spec.object_type,
x: build_spec.coordinates.x,
y: build_spec.coordinates.y,
z: build_spec.coordinates.z,
state: "created",
timestamp: NOW()
}
gui_state.widgets.APPEND(widget)
DESTROY:
idx := FIND(gui_state.widgets, w → w.id = build_spec.object_id)
IF idx ≠ NULL:
gui_state.widgets[idx].state := "destroyed"
UPDATE:
idx := FIND(gui_state.widgets, w → w.id = build_spec.object_id)
IF idx ≠ NULL:
MERGE(gui_state.widgets[idx], build_spec.properties)
NOOP:
PASS
gui_state.last_update := NOW()
WRITE → .gui/state/current.json
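The four-way case above can be sketched in Python (dict shapes are illustrative; DESTROY marks rather than deletes, matching the widget state machine):

```python
def update_gui(build_spec: dict, gui_state: dict) -> dict:
    """Apply one build spec to the widget list; NOOP passes through unchanged."""
    action = build_spec["action"]
    if action == "BUILD":
        gui_state["widgets"].append({
            "id": build_spec["object_id"],
            "type": build_spec["object_type"],
            "state": "created",
        })
    elif action in ("DESTROY", "UPDATE"):
        for widget in gui_state["widgets"]:
            if widget["id"] == build_spec["object_id"]:
                if action == "DESTROY":
                    widget["state"] = "destroyed"
                else:
                    widget.update(build_spec.get("properties", {}))
    return gui_state
```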
---
## §11 — PROOF GENERATION (FIRE)
PROOF := { index: INT, cycle_id: STRING, instruction_id: STRING, object_id: STRING?, coordinates: (x, y, z)?, rotation: INT, timestamp: ISO8601, proof_hash: HASH }
GENERATE_PROOF(cycle, build_spec, rotation) → PROOF:
proof := {
  index: COUNT(.past/*.tensor),
  cycle_id: cycle.id,
  instruction_id: cycle.instruction.id,
  object_id: build_spec.object_id ?? NULL,
  coordinates: build_spec.coordinates ?? NULL,
  rotation: rotation,
  timestamp: NOW()
}
proof.proof_hash := SHA256( proof.index || proof.cycle_id || proof.instruction_id || proof.rotation || proof.timestamp ).hex()
WRITE → .past/0x{index}_{cycle_id}.tensor
APPEND → .past/chain.index: "{index}:{cycle_id}:{proof_hash[0:8]}"
RETURN proof
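The proof hash computation can be reproduced in Python (assuming `||` is string concatenation of the stringified fields):

```python
import hashlib

def proof_hash(index, cycle_id, instruction_id, rotation, timestamp):
    """Hash the proof fields so each .tensor entry is chain-linkable."""
    material = f"{index}{cycle_id}{instruction_id}{rotation}{timestamp}"
    return hashlib.sha256(material.encode()).hexdigest()
```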
---
## §12 — VALIDATION (SPIRIT)
VALIDATION := { cycle_id: STRING, instruction_id: STRING, btc_anchor: {hash: HASH, height: INT}, proof_hash: HASH, timestamp: ISO8601 }
VALIDATE(cycle, proof) → BOOL:
btc := FETCH("https://blockchain.info/latestblock")
IF btc.error: btc := READ(.genesis/.tree).btc_anchor # Fallback to genesis anchor
validation := { cycle_id: cycle.id, instruction_id: cycle.instruction.id, btc_anchor: {hash: btc.hash, height: btc.height}, proof_hash: proof.proof_hash, timestamp: NOW() }
WRITE → .genesis/{cycle_id}.validated
RETURN TRUE
---
## §13 — TERMINAL ROLL
ROLL(genesis_tree) → genesis_tree':
genesis_tree.rotation := genesis_tree.rotation + 1
genesis_tree.gui_state.ready := TRUE
genesis_tree.gui_state.awaiting_input := TRUE
WRITE → .genesis/.tree
RETURN genesis_tree
---
## §14 — EVA SELECTION
EVA := {0, 1, 2, 3, 4, 5}
EVA_MAPPING := {
  0: "KETER"   → Omega/Closure,
  1: "CHOKMAH" → Creative/Developer,
  2: "BINAH"   → Data/Analysis,
  3: "GEVURAH" → Security_A/Constraints,
  4: "CHESED"  → Security_B/Exceptions,
  5: "TIFERET" → Executor/Balance
}
SELECT_EVA(sandbox_id) → EVA: RETURN sandbox_id % 6
SANDBOX_ID(btc_hash) → INT: RETURN parseInt(btc_hash[32:40], 16) % 1000
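Both selection functions are simple modular maps; a Python sketch (this follows §14's `hash[32:40]` slice; note §20 slices `hash[28:36]` for the same field):

```python
def sandbox_id(btc_hash: str) -> int:
    """Sandbox allocation from the middle of the block hash."""
    return int(btc_hash[32:40], 16) % 1000

def select_eva(sid: int) -> int:
    """Round-robin EVA assignment over the six sephirot (0..5)."""
    return sid % 6
```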
---
## §15 — FULL CYCLE EXECUTION
CYCLE(user_input) → result:
// §4: CLASSIFY
instruction := CLASSIFY(user_input)
WRITE → .tinyllm/.future/{instruction.id}.instruction
// §5: SCOOP
{vars, context, instructions} := WATCHDOG.SCOOP()
// §8: BUILD
∀ instr ∈ instructions:
  build_spec := GENERATE_BUILD(instr, vars.genesis_hash)
  WRITE → .claude/.future/{build_spec.object_id}.build
// §9: RUST
∀ spec ∈ .claude/.future/*.build:
  IF spec.action ≠ NOOP: GENERATE_RUST(spec)
// §10: GUI
gui_state := READ(.gui/state/current.json)
∀ spec ∈ .claude/.future/*.build:
  gui_state := UPDATE_GUI(spec, gui_state)
// §11: FIRE
proof := GENERATE_PROOF(cycle, spec, vars.rotation)
// §12: SPIRIT
valid := VALIDATE(cycle, proof)
// §13: ROLL
IF valid: genesis_tree := ROLL(READ(.genesis/.tree))
// CLEANUP
MOVE(.claude/.future/* → .claude/.past/)
MARK_PROCESSED(.tinyllm/.future/*)
// OUTPUT
result := FORMAT_RESULT(spec, proof)
WRITE → .faceout/{cycle.id}.result
RETURN result
---
## §16 — ERROR HANDLING
ERROR_HANDLERS:
ON ClassificationError(input):
  instruction.type := GENERAL
  CONTINUE
ON ScoopError(source):
  IF source = "context": context := {}
  IF source = "vars": vars := READ(.genesis/.tree)  # Minimal fallback
  CONTINUE
ON BuildError(instruction):
  build_spec := {action: NOOP, object_id: SHA256(instruction.id)}
  LOG("BUILD_ERROR", instruction)
  CONTINUE
ON RustError(spec):
  LOG("RUST_ERROR", spec)
  SKIP_RUST_GENERATION
  CONTINUE
ON GUIError(spec):
  LOG("GUI_ERROR", spec)
  RETAIN_PREVIOUS_STATE
  CONTINUE
ON ValidationError(cycle):
  LOG("VALIDATION_ERROR", cycle)
  RETRY(3) OR CONTINUE_WITHOUT_VALIDATION
ON FileError(path, operation):
  LOG("FILE_ERROR", path, operation)
  IF operation = "read": RETURN {}
  IF operation = "write": RETRY(3) OR LOG_AND_CONTINUE
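The `RETRY(3) OR <fallback>` pattern used by several handlers can be sketched as a small helper (names and the blanket `except` are illustrative, not canonical):

```python
def retry(operation, attempts=3, fallback=None):
    """Try `operation` up to `attempts` times; on total failure run `fallback`."""
    for _ in range(attempts):
        try:
            return operation()
        except Exception:
            continue  # swallow and retry, per CONTINUE semantics
    return fallback() if fallback else None
```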
---
## §17 — DIRECTORY STRUCTURE (CANONICAL)
~/.gentlyos/
├── .genesis/
│   ├── .tree                        # Genesis state
│   └── {cycle_id}.validated         # Validation records
├── .tinyllm/
│   ├── .future/{id}.instruction     # Pending
│   └── .past/{id}.modification      # Modifications
├── .watchdog/
│   ├── vars/{ts}.json
│   ├── context/{ts}.json
│   └── instructions/{id}.json
├── .claude/
│   ├── .future/{id}.build           # Pending builds
│   └── .past/{id}{ts}.built         # Completed
├── .rust/
│   ├── objects/{id}.rs
│   └── runtime/
├── .gui/
│   └── state/current.json
├── .trinary_claude-instruct.future/
│   ├── eva/eva{0-5}/
│   │   ├── .genesis/
│   │   ├── .future/
│   │   ├── .past/
│   │   └── .flux/
│   ├── 1dot/.instructions
│   ├── 0dot/{Cargo.toml, src/}
│   └── yesod/{.events/, .omega_dice/}
├── .flux/current_state.json
├── .future/
├── .past/
│   ├── 0x{n}_{id}.tensor
│   └── chain.index
├── .import_bucket/
├── .faceout/
└── .dev/
---
## §18 — INITIALIZATION SEQUENCE
INIT(mode) → genesis_tree:
ASSERT mode ∈ {DEV, SEED, HOST, CLIENT}
// 1. Fetch BTC anchor
btc := FETCH("https://blockchain.info/latestblock")
ASSERT btc.hash ≠ NULL
// 2. Generate genesis
genesis_tree := {
  root_hash: SHA256(btc.hash || NOW()).hex(),
  rotation: 0,
  btc_anchor: {hash: btc.hash, height: btc.height},
  created_at: NOW(),
  mode: mode,
  objects: [],
  gui_state: {ready: TRUE, awaiting_input: TRUE}
}
// 3. Create directories
∀ dir ∈ DIRECTORY_STRUCTURE: MKDIR(dir)
// 4. Write genesis
WRITE → .genesis/.tree
// 5. Initialize EVAs
∀ i ∈ [0..5]:
  eva_genesis := {
    parent: genesis_tree.root_hash,
    eva_id: i,
    sephirah: EVA_MAPPING[i],
    created_at: NOW()
  }
  WRITE → eva/eva_{i}/.genesis/.tree
RETURN genesis_tree
---
## §19 — WATCHDOG (0DOT) INSTRUCTION PARSING
WATCHDOG_INSTRUCTION := { type: "CLONE" | "CREATE" | "EXEC" | "WRITE", target: STRING, args: STRING[] }
PARSE_INSTRUCTION(line) → WATCHDOG_INSTRUCTION:
tokens := SPLIT(line, " ")
CASE tokens[0]:
  "CLONE":  RETURN {type: "CLONE", target: tokens[1], args: tokens[2:]}
  "CREATE": RETURN {type: "CREATE", target: tokens[1], args: []}
  "EXEC":   RETURN {type: "EXEC", target: tokens[1], args: tokens[2:]}
  "WRITE":  RETURN {type: "WRITE", target: tokens[1], args: tokens[2:]}
  DEFAULT:
    LOG("UNKNOWN_INSTRUCTION", line)
    RETURN NULL
EXECUTE_INSTRUCTION(instr):
CASE instr.type:
  "CLONE":  EXEC("git clone {instr.args[0]} .import_bucket/{instr.target}")
  "CREATE": MKDIR(instr.target)
  "EXEC":   EXEC(JOIN(instr.args, " "))
  "WRITE":  WRITE(instr.target, JOIN(instr.args, " "))
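The parser above is a single whitespace tokenization plus opcode dispatch; a Python sketch (dict in place of the WATCHDOG_INSTRUCTION record):

```python
def parse_instruction(line: str):
    """Tokenize one 0dot instruction line; unknown opcodes yield None."""
    tokens = line.split(" ")
    op = tokens[0]
    if op == "CREATE":
        return {"type": "CREATE", "target": tokens[1], "args": []}
    if op in ("CLONE", "EXEC", "WRITE"):
        return {"type": op, "target": tokens[1], "args": tokens[2:]}
    return None
```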
---
## §20 — BTC HASH DECODING
DECODE_BTC_HASH(hash) → {gateway, inversion, sandbox_id, cc_version}:
// Gateway layer: first non-zero hex position mod 22
gateway := 0
FOR i := 0 TO len(hash):
  IF hash[i] ≠ '0':
    gateway := i % 22
    BREAK
// Inversion point: XOR with 73
inv_byte := parseInt(hash[0:2], 16)
inversion := inv_byte XOR 73
// Sandbox ID: middle portion
sandbox_id := parseInt(hash[28:36], 16) % 1000
// Claude Code version: end portion
cc_version := parseInt(hash[56:64], 16) % 100
RETURN {gateway, inversion, sandbox_id, cc_version}
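The decoding above slices fixed regions of the 64-character hex hash; a direct Python sketch:

```python
def decode_btc_hash(h: str) -> dict:
    """Split a 64-char hex block hash into deterministic environment fields."""
    gateway = 0
    for i, c in enumerate(h):
        if c != "0":                  # first non-zero hex position
            gateway = i % 22
            break
    return {
        "gateway": gateway,
        "inversion": int(h[0:2], 16) ^ 73,   # XOR with 73
        "sandbox_id": int(h[28:36], 16) % 1000,
        "cc_version": int(h[56:64], 16) % 100,
    }
```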
---
## §21 — YESOD CONSENSUS
CONSENSUS(action, required_votes) → BOOL:
votes := []
∀ eva_id ∈ [0..5]:
  vote := POLL_EVA(eva_id, action)
  votes.APPEND(vote)
  WRITE → yesod/.events/{action.id}_{eva_id}.vote
approve_count := COUNT(votes, v → v.approve = TRUE)
IF approve_count >= required_votes:
  WRITE → yesod/.events/{action.id}.approved
  RETURN TRUE
ELSE:
  WRITE → yesod/.events/{action.id}.rejected
  RETURN FALSE
POLL_EVA(eva_id, action) → {eva_id, approve, reason}:
// Each EVA applies its own criteria
CASE eva_id:
  0: approve := action.type ≠ "DESTROY_GENESIS"
  1: approve := TRUE                       // Creative allows all
  2: approve := VALIDATE_DATA(action)
  3: approve := CHECK_PERMISSIONS(action)
  4: approve := TRUE                       // Mercy allows edge cases
  5: approve := BALANCE_CHECK(action)
RETURN {eva_id, approve, reason}
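The consensus gate itself reduces to a vote count; a Python sketch of the threshold check (vote dicts stand in for the EVA poll results):

```python
def consensus(votes, required_votes):
    """Approve an action when enough EVAs vote yes."""
    approve_count = sum(1 for v in votes if v["approve"])
    return approve_count >= required_votes
```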
---
## §22 — REBIRTH SEED
REBIRTH_SEED(fire_proof, spirit_validation) → HASH:
seed := SHA256( fire_proof.proof_hash || spirit_validation.btc_anchor.hash || spirit_validation.timestamp ).hex()
WRITE → .past/rebirth_{fire_proof.cycle_id}.seed
RETURN seed
---
## APPENDIX: TRUTH TABLES
INSTRUCTION FLOW:
┌─────────────────┬─────────────────┬─────────────────┐
│ Input Present   │ Genesis Valid   │ Action          │
├─────────────────┼─────────────────┼─────────────────┤
│ FALSE           │ *               │ IDLE            │
│ TRUE            │ FALSE           │ INIT            │
│ TRUE            │ TRUE            │ CYCLE           │
└─────────────────┴─────────────────┴─────────────────┘
BUILD DECISION:
┌─────────────────┬─────────────────┬─────────────────┐
│ Type            │ Target Exists   │ Action          │
├─────────────────┼─────────────────┼─────────────────┤
│ BUILD_REQUEST   │ FALSE           │ BUILD           │
│ BUILD_REQUEST   │ TRUE            │ UPDATE          │
│ MODIFICATION    │ TRUE            │ UPDATE          │
│ MODIFICATION    │ FALSE           │ NOOP + LOG      │
│ *               │ *               │ NOOP            │
└─────────────────┴─────────────────┴─────────────────┘
VALIDATION GATE:
┌─────────────────┬─────────────────┬─────────────────┐
│ Proof Valid     │ BTC Available   │ Action          │
├─────────────────┼─────────────────┼─────────────────┤
│ TRUE            │ TRUE            │ VALIDATE + ROLL │
│ TRUE            │ FALSE           │ VALIDATE (local)│
│ FALSE           │ *               │ RETRY or SKIP   │
└─────────────────┴─────────────────┴─────────────────┘
---
**END OF SPECIFICATION**
*Version 1.0 | Logic-Gated | Deterministic | BTC-Anchored*
═══════════════════════════════════════════════════════════════════════════════
GENESIS BOOT SEQUENCE
═══════════════════════════════════════════════════════════════════════════════
fetch genesis.seed
│
▼
seed.eva
│
▼
╔═════════════════════════════════════════════════════════════════════════╗
║ RUST TRIGGERS ║
║ ───────────── ║
║ 1. Decrypt files ║
║ 2. Validate hash ║
║ 3. Install environment ║
║ 4. Launch Docker ║
║ 5. Open terminal ║
║ 6. Execute startup loop bash ║
║ 7. Grab pinned IPFS location ║
║ 8. Hydrate data ║
║ 9. Build folders/files ║
║ 10. Serve RAG MCP PTC ║
╚═════════════════════════════════════════════════════════════════════════╝
│
▼
╔═════════════════════════════════════════════════════════════════════════╗
║ HYDRATE STATE ║
║ ───────────── ║
║ • Redis ║
║ • XML ║
║ • SVG ║
║ • Vectors ║
║ • Tables ║
╚═════════════════════════════════════════════════════════════════════════╝
│
▼
KEYS VALIDATED → .env setup → genesis.tree hashed
│
▼
╔═════════════════════════════════════════════════════════════════════════╗
║ HOST SERVER ORCHESTRATOR ║
║ ──────────────────────── ║
║ • Start Watchdog loop ║
║ • Start Orchestrator loop ║
║ • Claude Code started ║
║ • Context provided ║
║ • GUI rendered ║
╚═════════════════════════════════════════════════════════════════════════╝
│
▼
READY
**On your Termux:**
```bash
# Create the manifest file (filename = instruction)
touch fire.rust.docker.cc.llm.gen0.ssh.lang
# Run the builder to generate install script
python3 elemental_build.py
# Or run the install script directly
chmod +x elemental_install.sh
./elemental_install.sh
# This will:
# 1. [FIRE] Purge old state
# 2. [METAL] Install Rust, compile watchdog
# 3. [EARTH] Setup Docker, CC sandbox
# 4. [WATER] Setup LLM (llama.cpp)
# 5. [AIR] Plant genesis, setup SSH
# 6. [SPIRIT] Create activation script
# Then activate:
source ~/.gentlyos/.rib/activate.sh
```

The order is locked by the zero2onez algorithm:

FIRE  →  METAL  →  EARTH  →  WATER  →  AIR   →  SPIRIT
 −1       +1        +1        0        0         +1
purge   compile    build     flow   isolate     enter

Ternary: −1 = destroy, 0 = neutral, +1 = create

**Rust compiles at METAL (dimensions 3-6), before EARTH (containers), so the watchdog binary is ready before Docker even starts.** This is a submodule yet to be built and given its instructions.
The tools that build the production system install themselves inside the product they're building.
┌─────────────────────────────────────────────────────────────────────────────┐
│ │
│ LAYER 0: CLAUDE-CODE DEV ENVIRONMENT │
│ ══════════════════════════════════════ │
│ • For Gen1 .eva holders ONLY (founding trinity) │
│ • Full visibility into all spawns │
│ • Can access .dice folder for bug fixes │
│ • Dev mode requires genesis group consensus │
│ │
│ ┌─────────────────────────────────────────────────────────────────────┐ │
│ │ │ │
│ │ TOM (EVA-1) HOLDER-2 (EVA-2) HOLDER-3 (EVA-3/4/5) │ │
│ │ Creative Data Security/Executor │ │
│ │ │ │
│ │ ════════════════════════════════════════════════════════════════ │ │
│ │ │ │
│ │ CLAUDE-CODE CLI │ │
│ │ Gen1 Dev Environment │ │
│ │ │ │
│ └─────────────────────────────────────────────────────────────────────┘ │
│ │ │
│ │ BUILDS │
│ ▼ │
│ ┌─────────────────────────────────────────────────────────────────────┐ │
│ │ │ │
│ │ LAYER 1: PRODUCTION SYSTEM │ │
│ │ ══════════════════════════════ │ │
│ │ • Built BY the devtools │ │
│ │ • CONTAINS the devtools (recursive) │ │
│ │ • .dice folder hidden from children │ │
│ │ │ │
│ │ ┌───────────────────────────────────────────────────────────┐ │ │
│ │ │ │ │ │
│ │ │ .dice/ ← Gen1 access only │ │ │
│ │ │ ├── Z2Z-NFT.svg ← The scanner/fixer │ │ │
│ │ │ ├── dev_mode/ ← Consensus required │ │ │
│ │ │ ├── bug_fixes/ ← Local fixes (no fork) │ │ │
│ │ │ └── network_changes/ ← Requires repo branch │ │ │
│ │ │ │ │ │
│ │ └───────────────────────────────────────────────────────────┘ │ │
│ │ │ │
│ └─────────────────────────────────────────────────────────────────────┘ │
│ │ │
│ │ SPAWNS │
│ ▼ │
│ ┌─────────────────────────────────────────────────────────────────────┐ │
│ │ │ │
│ │ LAYER 2+: CHILD SPAWNS │ │
│ │ ══════════════════════════ │ │
│ │ • NOT visible to Gen1 directly │ │
│ │ • Locked inside parent's .dice folder │ │
│ │ • Governed by tiny LLM gatekeepers │ │
│ │ • Can only see UP to their parent .eva │ │
│ │ │ │
│ │ ┌─────────────────────────────────────────────────────────────┐ │ │
│ │ │ ┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐ │ │ │
│ │ │ │ Spawn 1 │ │ Spawn 2 │ │ Spawn 3 │ │ Spawn N │ │ │ │
│ │ │ │ │ │ │ │ │ │ ... │ │ │ │
│ │ │ │ tiny-LLM│ │ tiny-LLM│ │ tiny-LLM│ │ tiny-LLM│ │ │ │
│ │ │ └────┬────┘ └────┬────┘ └────┬────┘ └────┬────┘ │ │ │
│ │ │ │ │ │ │ │ │ │
│ │ │ └────────────┴────────────┴────────────┘ │ │ │
│ │ │ │ │ │ │
│ │ │ Gates instructions based on │ │ │
│ │ │ parent .eva that created it │ │ │
│ │ │ │ │ │
│ │ └─────────────────────────────────────────────────────────────┘ │ │
│ │ │ │
│ └─────────────────────────────────────────────────────────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
/// The .dice folder is the control plane for Gen1 holders
/// It contains the Z2Z-NFT scanner and all dev tools
/// Children cannot see or access this folder
pub struct DiceFolder {
/// The Z2Z-NFT - an SVG that is also a scanner/fixer
pub z2z_nft: Z2ZNft,
/// Dev mode - requires genesis group consensus
pub dev_mode: DevMode,
/// Bug fixes - local changes, no repo branch needed
pub bug_fixes: BugFixSystem,
/// Network changes - requires branching the repo
pub network_changes: NetworkChangeSystem,
/// Visibility controls
pub visibility: Visibility,
}
#[derive(Clone)]
pub struct Visibility {
/// Gen1 holders can see everything
pub gen1_full_access: bool, // Always true for Gen1
/// Child spawns CANNOT see the .dice folder
pub hidden_from_children: bool, // Always true
/// Child spawns are locked INSIDE .dice for Gen1 viewing
pub children_visible_in_dice: bool, // Gen1 can see children through dice
}
impl DiceFolder {
pub fn new() -> Self {
Self {
z2z_nft: Z2ZNft::create(),
dev_mode: DevMode::locked(), // Starts locked
bug_fixes: BugFixSystem::new(),
network_changes: NetworkChangeSystem::new(),
visibility: Visibility {
gen1_full_access: true,
hidden_from_children: true,
children_visible_in_dice: true,
},
}
}
/// Check if accessor is Gen1
pub fn check_access(&self, accessor: &EvaId) -> AccessLevel {
if accessor.is_gen1() {
AccessLevel::Full
} else {
AccessLevel::None // Children see nothing
}
}
}

/// The Z2Z-NFT is an SVG that:
/// - Renders as a visual NFT
/// - Contains embedded scanner code
/// - Can gather information from spawns
/// - Can fix bugs with proper consensus
/// - Is the key to dev mode
pub struct Z2ZNft {
/// The SVG visual representation
pub svg_content: String,
/// Embedded scanner module
pub scanner: Scanner,
/// Embedded fixer module
pub fixer: Fixer,
/// NFT metadata
pub metadata: NftMetadata,
/// Holder verification
pub holder: GenesisHash,
}
impl Z2ZNft {
pub fn create() -> Self {
let svg = generate_z2z_svg();
Self {
svg_content: svg,
scanner: Scanner::new(),
fixer: Fixer::new(),
metadata: NftMetadata {
name: "Z2Z Dev Key".to_string(),
symbol: "Z2Z".to_string(),
description: "Gen1 development access NFT".to_string(),
},
holder: GenesisHash::gen1(),
}
}
/// Scan all spawns for information
pub fn scan(&self, target: ScanTarget) -> Result<ScanResult> {
    // Verify holder is Gen1 (the `?` operator requires a Result return type)
    self.verify_gen1_holder()?;
    let result = match target {
        ScanTarget::AllSpawns => self.scanner.scan_all_spawns(),
        ScanTarget::Specific(spawn_id) => self.scanner.scan_spawn(&spawn_id),
        ScanTarget::ForMalware => self.scanner.scan_for_malware(),
        ScanTarget::ForBugs => self.scanner.scan_for_bugs(),
    };
    Ok(result)
}
/// Gather data from spawns
pub fn gather(&self, query: DataQuery) -> Result<GatheredData> {
    self.verify_gen1_holder()?;
    Ok(self.scanner.gather_data(query))
}
/// Fix a bug (local fix, no network change)
pub fn fix_bug(&self, bug: Bug, fix: Fix) -> Result<FixResult> {
    self.verify_gen1_holder()?;
    // Bug fixes don't require a repo branch,
    // but DO require genesis group consensus
    let consensus = self.check_genesis_consensus(&fix)?;
    if consensus.is_approved() {
        Ok(self.fixer.apply_fix(bug, fix))
    } else {
        Ok(FixResult::NeedsConsensus(consensus.missing()))
    }
}
}
/// Generate the Z2Z SVG with embedded functionality
fn generate_z2z_svg() -> String {
r#"<?xml version="1.0" encoding="UTF-8"?>
<svg xmlns="http://www.w3.org/2000/svg"
xmlns:xlink="http://www.w3.org/1999/xlink"
viewBox="0 0 400 400"
data-z2z-version="1.0.0"
data-scanner-enabled="true"
data-fixer-enabled="true">
<!-- Z2Z Visual -->
<defs>
<linearGradient id="z2z-gradient" x1="0%" y1="0%" x2="100%" y2="100%">
<stop offset="0%" style="stop-color:#1a1a2e;stop-opacity:1" />
<stop offset="100%" style="stop-color:#16213e;stop-opacity:1" />
</linearGradient>
</defs>
<!-- Background -->
<rect width="400" height="400" fill="url(#z2z-gradient)"/>
<!-- Z2Z Logo -->
<g transform="translate(100, 100)">
<!-- First Z -->
<path d="M 0 0 L 60 0 L 0 80 L 60 80"
fill="none" stroke="#e94560" stroke-width="8"/>
<!-- 2 -->
<text x="80" y="70" font-family="monospace" font-size="72" fill="#e94560">2</text>
<!-- Second Z -->
<path d="M 140 0 L 200 0 L 140 80 L 200 80"
fill="none" stroke="#e94560" stroke-width="8"/>
</g>
<!-- Scanner Indicator -->
<circle cx="350" cy="50" r="20" fill="#00ff00" opacity="0.5">
<animate attributeName="opacity" values="0.5;1;0.5" dur="2s" repeatCount="indefinite"/>
</circle>
<!-- Embedded Scanner Code (base64 encoded) -->
<metadata>
<z2z:scanner xmlns:z2z="https://gentlyos.dev/z2z">
<z2z:version>1.0.0</z2z:version>
<z2z:capabilities>
<z2z:scan>true</z2z:scan>
<z2z:gather>true</z2z:gather>
<z2z:fix>true</z2z:fix>
<z2z:dev_mode>consensus_required</z2z:dev_mode>
</z2z:capabilities>
<z2z:code encoding="base64">
<!-- Scanner WASM module embedded here -->
</z2z:code>
</z2z:scanner>
</metadata>
<!-- Gen1 Holder Verification -->
<g id="holder-verification" visibility="hidden">
<text x="200" y="380" text-anchor="middle" font-family="monospace" font-size="12" fill="#888">
Gen1 Access Only
</text>
</g>
</svg>"#.to_string()
}

/// Dev mode requires genesis group consensus to activate
/// Once activated, allows deep modifications to spawns
pub struct DevMode {
/// Current state
pub state: DevModeState,
/// Who activated it (must be Gen1)
pub activated_by: Option<GenesisHash>,
/// Consensus record
pub consensus: Option<ConsensusRecord>,
/// Capabilities when active
pub capabilities: DevCapabilities,
}
#[derive(Clone)]
pub enum DevModeState {
Locked,
AwaitingConsensus {
requested_by: GenesisHash,
approvals: Vec<Approval>,
},
Active {
activated_at: DateTime<Utc>,
expires_at: Option<DateTime<Utc>>,
},
}
#[derive(Clone)]
pub struct DevCapabilities {
/// Can modify spawn code
pub modify_spawn_code: bool,
/// Can remove malware
pub remove_malware: bool,
/// Can inject bug fixes
pub inject_bug_fixes: bool,
/// Can view all spawn internals
pub view_spawn_internals: bool,
/// CANNOT make network-wide changes (needs repo branch)
pub network_changes: bool, // Always false in dev mode
}
impl DevMode {
pub fn locked() -> Self {
Self {
state: DevModeState::Locked,
activated_by: None,
consensus: None,
capabilities: DevCapabilities {
modify_spawn_code: false,
remove_malware: false,
inject_bug_fixes: false,
view_spawn_internals: false,
network_changes: false, // Never true here
},
}
}
/// Request dev mode activation (requires consensus)
pub fn request_activation(&mut self, requester: GenesisHash) -> Result<ActivationRequest> {
// Must be Gen1
if !requester.is_gen1() {
return Err(DevModeError::NotGen1);
}
self.state = DevModeState::AwaitingConsensus {
requested_by: requester,
approvals: vec![],
};
Ok(ActivationRequest {
id: generate_request_id(),
requester,
requires: vec![
EvaId::One, // Creative
EvaId::Two, // Data
EvaId::Three, // Security A
EvaId::Four, // Security B
EvaId::Five, // Executor
EvaId::Zero, // Omega
],
})
}
/// Submit approval for dev mode
pub fn submit_approval(&mut self, from: EvaId, approval: Approval) -> Result<DevModeState> {
if let DevModeState::AwaitingConsensus { approvals, .. } = &mut self.state {
approvals.push(approval);
// Check if all 6 approved
if approvals.len() == 6 && approvals.iter().all(|a| a.is_yes()) {
self.state = DevModeState::Active {
activated_at: Utc::now(),
expires_at: Some(Utc::now() + Duration::hours(24)), // Auto-expires
};
self.capabilities = DevCapabilities {
modify_spawn_code: true,
remove_malware: true,
inject_bug_fixes: true,
view_spawn_internals: true,
network_changes: false, // Still requires repo branch
};
}
}
Ok(self.state.clone())
}
}

/// Bug fixes are local - don't require repo branch
/// Network changes require branching the repo
pub struct BugFixSystem {
/// Local fixes that don't affect network
pub local_fixes: Vec<LocalFix>,
}
pub struct NetworkChangeSystem {
/// Network-wide changes that require repo branch
pub pending_changes: Vec<NetworkChange>,
}
#[derive(Clone)]
pub struct LocalFix {
pub fix_id: FixId,
pub target_spawn: SpawnId,
pub description: String,
pub patch: Patch,
/// Requires genesis consensus but NOT repo branch
pub consensus: ConsensusRecord,
/// Does NOT propagate to other spawns
pub propagates: bool, // Always false
}
#[derive(Clone)]
pub struct NetworkChange {
pub change_id: ChangeId,
pub description: String,
pub affects: Vec<SpawnId>, // Could be All
/// REQUIRES repo branch
pub requires_branch: bool, // Always true
/// The branch that contains this change
pub branch: Option<BranchRef>,
/// Propagates to all affected spawns
pub propagates: bool, // Always true
}
impl BugFixSystem {
/// Apply a local bug fix (consensus required, no branch)
pub fn apply_fix(
&mut self,
fix: LocalFix,
consensus: &ConsensusRecord,
) -> Result<FixResult> {
// Verify consensus
if !consensus.is_unanimous() {
return Err(FixError::NeedsConsensus);
}
// Apply locally - does NOT affect network
apply_local_patch(&fix.target_spawn, &fix.patch)?;
self.local_fixes.push(fix);
Ok(FixResult::Applied)
}
}
impl NetworkChangeSystem {
/// Propose a network change (requires branching)
pub fn propose_change(&mut self, change: NetworkChange) -> Result<BranchRef> {
// Network changes MUST branch the repo
let branch = create_branch(&change)?;
let change_with_branch = NetworkChange {
branch: Some(branch.clone()),
..change
};
self.pending_changes.push(change_with_branch);
// Branch must be merged via consensus
// Merging the branch propagates to all spawns
Ok(branch)
}
/// Merge a network change branch (propagates to all)
pub fn merge_change(
&mut self,
branch: BranchRef,
consensus: &ConsensusRecord,
) -> Result<MergeResult> {
// Verify 6/6 consensus
if !consensus.is_unanimous() {
return Err(NetworkError::NeedsConsensus);
}
// Merge the branch
merge_branch(&branch)?;
// Propagate to all affected spawns
let change = self.pending_changes
.iter()
.find(|c| c.branch.as_ref() == Some(&branch))
.ok_or(NetworkError::ChangeNotFound)?;
propagate_to_spawns(&change)?;
// Notify tiny LLMs in each spawn
notify_spawn_llms(&change)?;
Ok(MergeResult::Merged)
}
}

/// Each spawn has a tiny LLM that gatekeeps instructions
/// Based on the .eva that created the spawn
pub struct TinyLlmGatekeeper {
/// The .eva that created this spawn
pub creator_eva: EvaId,
/// The instructions this gatekeeper allows
pub allowed_instructions: Vec<InstructionType>,
/// The model (tiny, runs locally)
pub model: TinyLlm,
/// Notification channel for changes
pub notifications: NotificationChannel,
}
impl TinyLlmGatekeeper {
/// Create gatekeeper based on parent .eva
pub fn new(creator_eva: EvaId) -> Self {
let allowed = match creator_eva {
EvaId::One => {
// Created by Creative - allows creative operations
vec![
InstructionType::UI,
InstructionType::Design,
InstructionType::Branding,
]
}
EvaId::Two => {
// Created by Data - allows data operations
vec![
InstructionType::Schema,
InstructionType::Query,
InstructionType::Storage,
]
}
EvaId::Three | EvaId::Four => {
// Created by Security - allows security operations
vec![
InstructionType::Encryption,
InstructionType::Access,
InstructionType::Audit,
]
}
EvaId::Five => {
// Created by Executor - allows execution operations
vec![
InstructionType::Deploy,
InstructionType::Run,
InstructionType::Monitor,
]
}
EvaId::Zero => {
// Created by Omega - allows all (rare)
vec![InstructionType::All]
}
};
Self {
creator_eva,
allowed_instructions: allowed,
model: TinyLlm::load_for_spawn(),
notifications: NotificationChannel::new(),
}
}
/// Gate an incoming instruction
pub fn gate_instruction(&self, instruction: &Instruction) -> GateResult {
// Check if instruction type is allowed
if !self.is_allowed(&instruction.instruction_type) {
return GateResult::Blocked {
reason: format!(
"Instruction type {:?} not allowed by creator EVA-{}",
instruction.instruction_type,
self.creator_eva.number()
),
};
}
// Use tiny LLM to evaluate instruction content
let evaluation = self.model.evaluate(instruction);
match evaluation {
Evaluation::Safe => GateResult::Allowed,
Evaluation::Suspicious(reason) => GateResult::Review { reason },
Evaluation::Malicious(reason) => GateResult::Blocked { reason },
}
}
/// Receive notification of network change
pub fn receive_notification(&mut self, notification: ChangeNotification) {
match notification.change_type {
ChangeType::BugFix => {
// Bug fix from Gen1 - apply locally
self.apply_bug_fix(&notification.patch);
}
ChangeType::NetworkChange => {
// Network change from merged branch - update self
self.apply_network_change(&notification.patch);
}
ChangeType::MalwareRemoval => {
// Malware removal - highest priority
self.remove_malware(&notification.targets);
}
}
}
fn is_allowed(&self, instruction_type: &InstructionType) -> bool {
self.allowed_instructions.contains(&InstructionType::All) ||
self.allowed_instructions.contains(instruction_type)
}
}

/// The Claude-Code dev environment for Gen1 holders
/// This is where the founding trinity builds and maintains GentlyOS
pub struct ClaudeCodeDevEnv {
/// The Gen1 holder using this environment
pub holder: Gen1Holder,
/// The .dice folder (visible to Gen1 only)
pub dice: DiceFolder,
/// Connection to production system
pub production: ProductionConnection,
/// Local development workspace
pub workspace: DevWorkspace,
/// Tools that install themselves into what they build
pub devtools: RecursiveDevtools,
}
pub struct RecursiveDevtools {
/// The tools themselves
pub tools: Vec<DevTool>,
/// Whether tools are installed in production
pub installed_in_production: bool,
}
impl RecursiveDevtools {
/// Tools install themselves into the product they build
pub fn install_into_production(&mut self, production: &mut ProductionSystem) -> Result<()> {
for tool in &self.tools {
// Each tool installs itself into production
// This makes production self-maintaining
production.install_tool(tool)?;
}
self.installed_in_production = true;
Ok(())
}
}
impl ClaudeCodeDevEnv {
/// Initialize the dev environment for a Gen1 holder
pub fn init(holder: Gen1Holder) -> Result<Self> {
println!("");
println!("═══════════════════════════════════════════════════════════════");
println!(" CLAUDE-CODE DEV ENVIRONMENT");
println!(" Gen1 Access Level");
println!("═══════════════════════════════════════════════════════════════");
println!("");
// Verify Gen1
if !holder.is_gen1() {
return Err(DevEnvError::NotGen1);
}
// Initialize .dice folder
let dice = DiceFolder::new();
// Connect to production
let production = ProductionConnection::establish(&holder)?;
// Create workspace
let workspace = DevWorkspace::create(&holder)?;
// Initialize devtools (they will install into production)
let devtools = RecursiveDevtools {
tools: vec![
DevTool::Scanner,
DevTool::Fixer,
DevTool::Compiler,
DevTool::Tester,
DevTool::Deployer,
],
installed_in_production: false,
};
let mut env = Self {
holder,
dice,
production,
workspace,
devtools,
};
// Install devtools into production
env.devtools.install_into_production(&mut env.production.system)?;
println!(" Holder: {:?}", env.holder.eva_id());
println!(" .dice: Initialized");
println!(" Z2Z-NFT: Loaded");
println!(" Devtools: Installed in production");
println!(" Dev Mode: Locked (requires consensus)");
println!("");
println!("═══════════════════════════════════════════════════════════════");
Ok(env)
}
/// Scan all spawns using Z2Z-NFT
pub fn scan(&self, target: ScanTarget) -> Result<ScanResult> {
self.dice.z2z_nft.scan(target)
}
/// Fix a bug (local, no branch needed)
pub fn fix_bug(&mut self, bug: Bug, fix: Fix) -> Result<FixResult> {
// Get consensus from genesis group
let consensus = self.request_genesis_consensus(&fix)?;
if consensus.is_unanimous() {
let local_fix = LocalFix {
    fix_id: generate_fix_id(),
    target_spawn: bug.spawn_id,
    description: fix.description.clone(),
    patch: fix.patch.clone(),
    consensus: consensus.clone(), // cloned: `consensus` is still borrowed below
    propagates: false,
};
self.dice.bug_fixes.apply_fix(local_fix, &consensus)
} else {
Err(DevEnvError::NeedsConsensus)
}
}
/// Make a network change (requires branch)
pub fn propose_network_change(&mut self, change: NetworkChange) -> Result<BranchRef> {
// This creates a branch - merge requires consensus
self.dice.network_changes.propose_change(change)
}
/// Request dev mode (requires 6/6 consensus)
pub fn request_dev_mode(&mut self) -> Result<DevModeState> {
self.dice.dev_mode.request_activation(self.holder.genesis_hash())
}
/// Remove malware from spawns
pub fn remove_malware(&mut self, targets: Vec<SpawnId>) -> Result<()> {
// Must be in dev mode
if !matches!(self.dice.dev_mode.state, DevModeState::Active { .. }) {
return Err(DevEnvError::DevModeRequired);
}
for target in targets {
self.dice.z2z_nft.fixer.remove_malware(&target)?;
}
Ok(())
}
}

~/.gentlyos/
│
├── .dice/ # GEN1 ACCESS ONLY
│ │
│ ├── Z2Z-NFT.svg # The scanner/fixer NFT
│ │
│ ├── dev_mode/ # Consensus-locked
│ │ ├── .state # Current dev mode state
│ │ ├── .consensus # Consensus record
│ │ └── .capabilities # Active capabilities
│ │
│ ├── bug_fixes/ # Local fixes
│ │ ├── fix_001/
│ │ │ ├── .patch
│ │ │ ├── .consensus
│ │ │ └── .applied
│ │ └── fix_002/
│ │ └── ...
│ │
│ ├── network_changes/ # Requires branching
│ │ ├── change_001/
│ │ │ ├── .description
│ │ │ ├── .branch # Git branch ref
│ │ │ ├── .consensus
│ │ │ └── .merged # Only if merged
│ │ └── change_002/
│ │ └── ...
│ │
│ ├── spawns/ # Children visible to Gen1 HERE
│ │ ├── spawn_001/
│ │ │ ├── .state
│ │ │ ├── .tiny_llm/ # The gatekeeper
│ │ │ │ ├── model.gguf
│ │ │ │ ├── .creator_eva # Which EVA made this
│ │ │ │ └── .allowed_instructions
│ │ │ └── .notifications/
│ │ ├── spawn_002/
│ │ └── spawn_N/
│ │
│ └── malware_quarantine/ # Isolated threats
│ └── ...
│
├── .trinary_claude-instruct.future/
│ └── eva/
│ ├── eva_0/ (Omega)
│ ├── eva_1/ (Creative)
│ ├── eva_2/ (Data)
│ ├── eva_3/ (Security A)
│ ├── eva_4/ (Security B)
│ └── eva_5/ (Executor)
│
├── .claude-dev/ # Dev workspace
│ ├── src/
│ ├── tests/
│ └── builds/
│
└── production/ # The built system
├── (devtools installed here)
└── (spawns created here)
┌─────────────────────────────────────────────────────────────────────────────┐
│ │
│ 1. GEN1 HOLDER OPENS CLAUDE-CODE │
│ │ │
│ ▼ │
│ 2. DEV ENVIRONMENT INITIALIZES │
│ • .dice folder accessible │
│ • Z2Z-NFT loaded │
│ • Devtools installed into production │
│ │ │
│ ▼ │
│ 3. SCAN SPAWNS (via Z2Z-NFT) │
│ • See all children (in .dice/spawns/) │
│ • Detect bugs │
│ • Detect malware │
│ │ │
│ ▼ │
│ 4. IF BUG FOUND: │
│ │ │
│ ├── Local fix? ──► Get genesis consensus ──► Apply patch │
│ │ (no branch needed) (no propagation) │
│ │ │
│ └── Network fix? ──► Create branch ──► Get consensus ──► Merge │
│ (propagates to all spawns) │
│ (notifies tiny LLMs) │
│ │ │
│ ▼ │
│ 5. IF MALWARE FOUND: │
│ │ │
│ └── Request dev mode (6/6 consensus) ──► Remove malware │
│ │ │
│ ▼ │
│ 6. DEV MODE EXPIRES (24h max) │
│ │ │
│ ▼ │
│ 7. TINY LLMs CONTINUE GATEKEEPING │
│ • Based on creator .eva │
│ • Receive notifications of changes │
│ • Apply bug fixes automatically │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
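The routing in steps 4 and 5 reduces to a consensus check. A minimal Python sketch of that decision follows; `route_fix` and the six-vote assumption are illustrative only (the authoritative logic is the Rust `fix_bug` / `propose_network_change` shown earlier):

```python
# Hypothetical sketch of the step-4/5 routing. Assumes the genesis group
# is the six EVA guardians and that consensus must be unanimous.
def route_fix(scope: str, votes: list) -> str:
    unanimous = len(votes) == 6 and all(votes)
    if scope == "local":
        # Local fix: consensus, no branch, no propagation
        return "apply_patch" if unanimous else "needs_consensus"
    if scope == "network":
        # Network fix: branch first; merge (and propagate) only on consensus
        return "merge_branch" if unanimous else "branch_open"
    raise ValueError(f"unknown scope: {scope}")

print(route_fix("local", [True] * 6))              # apply_patch
print(route_fix("network", [True] * 5 + [False]))  # branch_open
```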
┌─────────────────────────────────────────────────────────────────────────────┐
│ │
│ CLAUDE-CODE DEV ENVIRONMENT │
│ • For Gen1 holders only │
│ • Full visibility into spawns │
│ • .dice folder hidden from children │
│ │
│ Z2Z-NFT │
│ • SVG that is also a scanner │
│ • Can scan, gather, fix │
│ • Key to dev mode │
│ │
│ DEV MODE │
│ • Requires 6/6 genesis consensus │
│ • Allows deep modifications │
│ • Auto-expires after 24h │
│ │
│ BUG FIXES │
│ • Local: consensus, no branch │
│ • Network: branch, consensus, merge, propagate │
│ │
│ TINY LLM GATEKEEPERS │
│ • Each spawn has one │
│ • Gates instructions based on creator .eva │
│ • Receives notifications of changes │
│ │
│ CHILDREN CANNOT SEE │
│ • .dice folder │
│ • Dev mode │
│ • Other spawns │
│ • Gen1 activities │
│ │
│ CHILDREN CAN SEE │
│ • Their own .eva │
│ • Instructions allowed by their creator │
│ • Notifications pushed to them │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
┌─────────────────────────────────────────────────────────────────────────────┐
│ CODIE STACK │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ LAYER 3: RUST RUNTIME │
│ ════════════════════ │
│ Consumes JSONL strokes, executes Const functions, │
│ manages environment state, outputs artifacts │
│ │
│ ▲ JSONL strokes (line-by-line events) │
│ │ │
├────────────┼────────────────────────────────────────────────────────────────┤
│ │ │
│ LAYER 2: CODIE INVERTED RUNTIME (Python) │
│ ══════════════════════════════════════════ │
│ Names = Functions, Files = Hosts, Functions = Const, Objects = Strings │
│ 3-Way Rolling: House → Execute → Fold → Rescramble │
│ │
│ ▲ Parsed FileChains │
│ │ │
├────────────┼────────────────────────────────────────────────────────────────┤
│ │ │
│ LAYER 1: FILE CHAIN GRAMMAR │
│ ═══════════════════════════ │
│ .genesis.eva<>ELEMENT.EVA.DIMENSION<SCOPE>ACTION<TARGET>>>OUTPUT │
│ Filename IS the program │
│ │
│ ▲ Lexicon lookup │
│ │ │
├────────────┼────────────────────────────────────────────────────────────────┤
│ │ │
│ LAYER 0: LEXICON TABLES (Pandas DataFrames) │
│ ═══════════════════════════════════════════ │
│ ELEMENTS, EVA_GUARDIANS, DIMENSIONS_22, TRANSFORMATIONS, │
│ SACRED_CONSTANTS, FILE_SYSTEM, SOCKET_INTERFACES... │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
| Normal Programming | CODIE (Inverted) |
|---|---|
| def foo(): return 42 | "foo" IS the execution |
| call foo() | Naming "foo" triggers it |
| File stores data | File IS the runtime host |
| object = { key: value } | object = "serialized_string" |
| const X = 5 | function X() returns 5 |
| Mutable variables | Everything is const until fold |
| Save then delete | Delete then save (destructive-first) |
| Encrypt then send | Send contains encryption seed |
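The first row of this table is the core trick. A minimal sketch of naming-as-calling, using a hypothetical, simplified `Names` class (the full `NameRegistry` implementation appears later in this spec):

```python
# Minimal sketch: naming IS calling. Attribute ACCESS executes the stored
# value; attribute ASSIGNMENT registers it. Simplified: no '_'-prefix guard.
class Names:
    def __init__(self):
        # Bypass our own __setattr__ hook for the registry itself
        object.__setattr__(self, '_registry', {})

    def __setattr__(self, name, value):
        # Setting a name REGISTERS it as executable
        self._registry[name] = value

    def __getattr__(self, name):
        # Accessing a name EXECUTES it (here: returns the constant)
        if name in self._registry:
            return self._registry[name]
        raise NameError(f"'{name}' not registered (therefore not callable)")

NAMES = Names()
NAMES.foo = 42        # registration IS definition
print(NAMES.foo)      # naming IS calling → prints 42
```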
╔═══════════════════════════════════════════════════════════════════════════╗
║ ROLL 1: HOUSE ║
║ ───────────────── ║
║ • Chain contains its own execution environment ║
║ • No external dependencies ║
║ • Everything becomes string (objects = strings) ║
║ ║
║ ROLL 2: EXECUTE ║
║ ───────────────── ║
║ • Naming triggers execution (names = functions) ║
║ • Chain processes its own instructions ║
║ • Results accumulate in execution log ║
║ ║
║ ROLL 3: FOLD ║
║ ───────────────── ║
║ • Collect all results ║
║ • DELETE originals ║
║ • SAVE folded state ║
║ • RESCRAMBLE with new key (rotation++) ║
║ • Ready for next cycle ║
╚═══════════════════════════════════════════════════════════════════════════╝
┌──────────────────────────────────────────────────────────┐
│ │
│ HOUSE EXECUTE FOLD │
│ ┌─────┐ ┌─────┐ ┌─────┐ │
│ │ x=1 │────name────▶│ x=1 │────────────▶│ │ │
│ │ y=2 │────name────▶│ y=2 │ delete │fold │ │
│ │ z=3 │ │ │──────────▶ │ 1 │ │
│ └─────┘ └─────┘ save └──┬──┘ │
│ scramble │ │
│ │ │
│ ┌──────────────────────────────────────────┘ │
│ │ │
│ ▼ │
│ ┌─────┐ ┌─────┐ ┌─────┐ │
│ │fold1│────name────▶│fold1│────────────▶│fold │ │
│ │ a=4 │────name────▶│ a=4 │ delete │ 2 │ │
│ │ │ │ │──────────▶ │ │ │
│ └─────┘ └─────┘ save └──┬──┘ │
│ scramble │ │
│ │ │
│ ┌──────────────────────────────────────────┘ │
│ ▼ │
│ ...continues... │
│ │
└──────────────────────────────────────────────────────────┘
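The diagram above compresses to a few lines. This sketch performs one HOUSE → EXECUTE → FOLD pass; the key derivation follows the spec (SHA-256 of genesis:rotation, XOR 73), but the sketch is illustrative, not the runtime:

```python
import hashlib, json, base64

# One HOUSE → EXECUTE → FOLD pass, reduced to the essentials.
genesis, rotation = "abc123", 0
house = {"x": "1", "y": "2", "z": "3"}          # ROLL 1: housed as strings
log = [f"{k}={house[k]}" for k in ("x", "y")]   # ROLL 2: naming is calling

# ROLL 3: serialize, scramble with a genesis:rotation key, delete, save
state = json.dumps({"house": house, "execute": log})
key = bytes(b ^ 73 for b in hashlib.sha256(f"{genesis}:{rotation}".encode()).digest())
ext = (key * (len(state) // len(key) + 1))[:len(state)]
fold = base64.b64encode(bytes(ord(c) ^ k for c, k in zip(state, ext))).decode()
rotation += 1
house = {f"fold_{rotation}": fold}              # originals deleted, fold saved
print(list(house.keys()))                       # ['fold_1']
```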
ELEMENTS = {
'SPIRIT': 'validation, hashing, XOR-73',
'AIR': 'I/O boundary, import/export',
'WATER': 'processing, Claude, building',
'EARTH': 'sandbox, execution, Rust',
'FIRE': 'cleanup, proofs, destruction'
}
# Usage in chain:
# .genesis.eva<>spirit... → validation layer
# .genesis.eva<>water... → processing layer

EVA_GUARDIANS = {
'eva_0': 'KETER - Omega/Closure',
'eva_1': 'CHOKMAH - Creative',
'eva_2': 'BINAH - Data',
'eva_3': 'GEVURAH - Security_A',
'eva_4': 'CHESED - Security_B',
'eva_5': 'TIFERET - Executor'
}
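Guardian selection is deterministic. A minimal sketch of the `sandbox_id % 6` rule:

```python
# Sketch: deterministic guardian selection by sandbox id (sandbox_id % 6).
EVA_GUARDIANS = {
    'eva_0': 'KETER - Omega/Closure',
    'eva_1': 'CHOKMAH - Creative',
    'eva_2': 'BINAH - Data',
    'eva_3': 'GEVURAH - Security_A',
    'eva_4': 'CHESED - Security_B',
    'eva_5': 'TIFERET - Executor',
}

def select_eva(sandbox_id: int) -> str:
    """Map a sandbox to its guardian: sandbox_id % 6 → eva_N."""
    return f"eva_{sandbox_id % 6}"

print(select_eva(7))                    # eva_1
print(EVA_GUARDIANS[select_eva(7)])     # CHOKMAH - Creative
```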
# Selection: sandbox_id % 6 → EVA

DIMENSION_FUNCTIONS = {
'fool': 'input_boundary',
'magician': 'translate',
'lovers': 'branch',
'wheel': 'cycle',
'hanged_man': 'suspend',
'death': 'transform',
'devil': 'lock',
'tower': 'break',
'moon': 'encrypt',
'sun': 'decrypt',
'judgement': 'validate_final',
'world': 'complete'
}
# Usage in chain:
# .genesis.eva<>spirit.moon... → encrypt in validation layer
# .genesis.eva<>water.magician... → translate in processing layer

ACTIONS = {
'get', 'set', 'execute', 'save', 'feed',
'build', 'destroy', 'update', 'validate',
'roll', 'encrypt', 'decrypt', 'rotate',
'hash', 'xor', 'classify', 'scoop'
}

CODIE outputs JSONL (JSON Lines) that the Rust runtime consumes:
{"type":"BOOT","host":"file.codie","genesis":"abc123","rotation":0}
{"type":"HOUSE","name":"x","value":"100"}
{"type":"HOUSE","name":"y","value":"200"}
{"type":"EXECUTE","name":"x"}
{"type":"EXECUTE","name":"y"}
{"type":"FOLD","rotation":1,"fold_size":188}
{"type":"SHUTDOWN","final_rotation":1}

impl CodieRuntime {
pub fn from_jsonl(jsonl: &str) -> Self {
for line in jsonl.lines() {
// Parse each line as one stroke event; skip malformed lines
let Ok(event) = serde_json::from_str::<serde_json::Value>(line) else { continue };
match event["type"].as_str() {
Some("BOOT") => { /* init */ }
Some("HOUSE") => { /* add to hashmap */ }
Some("EXECUTE") => { /* call function */ }
Some("FOLD") => { /* increment rotation */ }
_ => {}
}
}
}
pub fn invoke(&self, name: &str) -> Option<&String> {
// NAMING IS CALLING
self.house.get(name)
}
}

Every fold cycle:
- Rotation increments: rotation = rotation + 1
- Key derivation: key = SHA256(genesis + rotation) XOR 73
- Scramble: XOR(data, key)
- Same data, different ciphertext each time
def _scramble_key(self) -> bytes:
seed = f"{self.genesis}:{self.rotation}"
raw_key = hashlib.sha256(seed.encode()).digest()
    return bytes(b ^ 73 for b in raw_key)  # XOR 73

.genesis.eva<>water.eva_1.magician<(translate user intent)>build<widget>>.claude.future
───────────────────────
Fuzzy, AI-interpreted
.genesis.eva<>spirit.eva_0.judgement<[proof_valid AND rotation > 0]>validate<cycle>>
──────────────────────────────
Boolean condition must pass
.genesis.eva<>fire.death<{cycle_id=abc123}>save<proof>>.past.chain
─────────────────
Raw binding, exact values
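The three scope forms are mechanically distinguishable by their bracket style. A hypothetical classifier sketch (`classify_scope` is illustrative; the real extraction happens in the Layer 1 parser):

```python
# Sketch: classify a chain's <SCOPE> payload by its bracket style.
# (...) = circle/fuzzy, [...] = square/boolean, {...} = direct binding.
def classify_scope(scope: str) -> str:
    scope = scope.strip()
    if scope.startswith('(') and scope.endswith(')'):
        return 'circle'   # fuzzy, AI-interpreted
    if scope.startswith('[') and scope.endswith(']'):
        return 'square'   # boolean condition must pass
    if scope.startswith('{') and scope.endswith('}'):
        return 'direct'   # raw binding, exact values
    raise ValueError(f"unknown scope form: {scope!r}")

print(classify_scope('(translate user intent)'))        # circle
print(classify_scope('[proof_valid AND rotation > 0]')) # square
print(classify_scope('{cycle_id=abc123}'))              # direct
```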
USER INPUT
│
▼
┌─────────────────────────────────────────────────────┐
│ LAYER 0: LEXICON LOOKUP │
│ ───────────────────── │
│ "water" → Element.WATER │
│ "eva_1" → Eva.EVA_1 │
│ "magician" → Dimension.MAGICIAN │
│ "build" → Action.BUILD │
└─────────────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────┐
│ LAYER 1: CHAIN PARSER │
│ ───────────────────── │
│ Parse filename → FileChain struct │
│ Validate grammar │
│ Extract scope (circle/square/direct) │
└─────────────────────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────┐
│ LAYER 2: INVERTED RUNTIME │
│ ───────────────────────── │
│ FileHost boots │
│ Names registered (naming = calling) │
│ 3-way roll: HOUSE → EXECUTE → FOLD │
│ Scramble with XOR-73 derived key │
└─────────────────────────────────────────────────────┘
│
▼ JSONL strokes
┌─────────────────────────────────────────────────────┐
│ LAYER 3: RUST RUNTIME │
│ ───────────────────── │
│ Consume JSONL line-by-line │
│ Dispatch to handlers │
│ Maintain state (house hashmap) │
│ Output artifacts │
└─────────────────────────────────────────────────────┘
│
▼
RUST ARTIFACTS (compiled objects, GUI state, proofs)
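The four layers above can be walked end-to-end in miniature. This toy pass uses a simplified regex in place of the Layer 1 grammar and a Python stand-in for the Rust runtime; both are illustrative, not the real implementations:

```python
import json, re

# Toy end-to-end pass through the four layers for one chain filename.
chain = ".genesis.eva<>water.eva_1.magician<(build search button)>build<button>>.out"

# LAYER 0/1: lexicon lookup + parse (hypothetical simplified grammar)
m = re.match(r"\.genesis\.eva<>(\w+)\.(\w+)\.(\w+)<(.+?)>(\w+)<(\w+)>>(.+)", chain)
element, eva, dimension, scope, action, target, output = m.groups()

# LAYER 2: inverted runtime - house the parse, "execute" by naming
house = {"element": element, "action": action, "target": target}
strokes = [json.dumps({"type": "HOUSE", "name": k, "value": v}) for k, v in house.items()]
strokes.append(json.dumps({"type": "EXECUTE", "name": "action"}))

# LAYER 3: a Rust-runtime stand-in consumes JSONL line-by-line
state = {}
for line in strokes:
    event = json.loads(line)
    if event["type"] == "HOUSE":
        state[event["name"]] = event["value"]

print(element, eva, dimension, action, target)  # water eva_1 magician build button
print(state["target"])                          # button
```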
| File | Purpose |
|---|---|
| gentlyos_lego_modules.py | Lexicon tables (Layer 0) |
| codie_file_chains.py | Grammar parser (Layer 1) |
| codie_inverted.py | Inverted runtime (Layer 2) |
| GENTLYOS_LEGO_TABLES.md | Table documentation |
| CODIE_GRAMMAR.md | Grammar specification |
| CODIE_SPEC.md | This document |
| *.csv | Individual table exports |
| gentlyos_master_tables.json | Complete JSON export |
#!/usr/bin/env python3
"""
╔═══════════════════════════════════════════════════════════════════════╗
║              ⣿⣿⣿  CODIE - THE INVERTED LANGUAGE  ⣿⣿⣿                 ║
║                                                                       ║
║  INVERSION RULES:                                                     ║
║    Names     → Functions  (naming IS calling)                         ║
║    Files     → Hosts      (files ARE execution environments)          ║
║    Functions → Const      (functions ARE immutable data)              ║
║    Objects   → Strings    (everything IS serialized text)             ║
║                                                                       ║
║  3-WAY ROLLING:                                                       ║
║    1. Houses itself     (self-contained)                              ║
║    2. Executes itself   (self-running)                                ║
║    3. Folds into itself (self-consuming)                              ║
║       → Delete → Save → Rescramble                                    ║
║                                                                       ║
║  OUTPUT: JSONL strokes Rust runtime                                   ║
╚═══════════════════════════════════════════════════════════════════════╝
"""
import json
import hashlib
import os
import time
from dataclasses import dataclass, field, asdict
from typing import Dict, List, Any, Optional, Callable
from enum import Enum
import base64
""" NORMAL PROGRAMMING │ CODIE (INVERTED) ────────────────────────────┼──────────────────────────────── function foo() { ... } │ "foo" IS the execution call foo() │ naming "foo" triggers it file stores data │ file IS the runtime host object = { key: value } │ object = "serialized_string" const X = 5 │ function X() returns 5 mutable variables │ everything is const until fold save then delete │ delete then save (destructive-first) encrypt then send │ send contains encryption seed """
class NameRegistry:
    """
    In CODIE, naming IS calling.
    When you reference a name, it executes.
    The registry holds name→execution mappings.
    """
_instance = None
def __new__(cls):
if cls._instance is None:
cls._instance = super().__new__(cls)
cls._instance._registry: Dict[str, 'Const'] = {}
cls._instance._execution_log: List[str] = []
return cls._instance
def register(self, name: str, const: 'Const'):
"""Register a name. Registration IS definition."""
self._registry[name] = const
def __getattr__(self, name: str) -> Any:
"""Accessing a name EXECUTES it."""
if name.startswith('_'):
return super().__getattribute__(name)
if name in self._registry:
const = self._registry[name]
result = const.execute()
self._execution_log.append(f"{name}→{result}")
return result
raise NameError(f"Name '{name}' not registered (therefore not callable)")
def __setattr__(self, name: str, value: Any):
"""Setting a name REGISTERS it as executable."""
if name.startswith('_'):
super().__setattr__(name, value)
else:
if isinstance(value, Const):
self._registry[name] = value
else:
# Wrap raw values as Const
self._registry[name] = Const(value)
NAMES = NameRegistry()
@dataclass
class Const:
    """
    In CODIE, functions ARE constants.
    A Const holds immutable data that "executes" by returning itself.
    The execution IS the data. The data IS the execution.
    """
value: Any
transform: Optional[str] = None # Optional transformation on execute
fold_count: int = 0 # How many times this has folded
scramble_seed: Optional[bytes] = None
def execute(self) -> Any:
"""Execute = return value (possibly transformed)."""
if self.transform == "xor73":
if isinstance(self.value, int):
return self.value ^ 73
elif isinstance(self.value, bytes):
return bytes(b ^ 73 for b in self.value)
elif self.transform == "hash":
return hashlib.sha256(str(self.value).encode()).hexdigest()
elif self.transform == "fold":
# Folding returns the fold_count, incrementing it
self.fold_count += 1
return self.fold_count
return self.value
def to_string(self) -> str:
"""Objects ARE strings."""
return json.dumps({
'value': self.value if not isinstance(self.value, bytes) else base64.b64encode(self.value).decode(),
'transform': self.transform,
'fold_count': self.fold_count,
})
@classmethod
def from_string(cls, s: str) -> 'Const':
"""Strings ARE objects."""
data = json.loads(s)
return cls(
value=data['value'],
transform=data.get('transform'),
fold_count=data.get('fold_count', 0)
)
class FileHost:
    """
    In CODIE, files ARE hosts (execution environments).
    A file doesn't store data - it IS the runtime.
    Opening a file = booting an environment.
    """
def __init__(self, filepath: str):
self.filepath = filepath
self.environment: Dict[str, Const] = {}
self.jsonl_log: List[str] = [] # JSONL output stream
self.rotation: int = 0
self.genesis_hash: Optional[str] = None
self._alive = False
def boot(self, genesis_hash: str = None):
"""Boot the file as a host environment."""
self._alive = True
self.genesis_hash = genesis_hash or hashlib.sha256(
f"{self.filepath}{time.time()}".encode()
).hexdigest()
# Log boot event
self._stroke({
'event': 'BOOT',
'host': self.filepath,
'genesis': self.genesis_hash,
'rotation': self.rotation
})
return self
def inject(self, name: str, const: Const):
"""Inject a Const into this host environment."""
self.environment[name] = const
self._stroke({
'event': 'INJECT',
'name': name,
'value': const.to_string()
})
def invoke(self, name: str) -> Any:
"""Invoke a name in this host (naming = execution)."""
if name not in self.environment:
raise NameError(f"'{name}' not in host '{self.filepath}'")
const = self.environment[name]
result = const.execute()
self._stroke({
'event': 'INVOKE',
'name': name,
'result': str(result)
})
return result
def fold(self):
"""
FOLD: The 3-way roll.
1. Execute all (houses itself)
2. Collect results (executes itself)
3. Delete originals, save folded, rescramble (folds into itself)
"""
# Phase 1: Execute all - collect results
results = {}
for name, const in list(self.environment.items()):
results[name] = const.execute()
self._stroke({
'event': 'FOLD_PHASE_1',
'executed': list(results.keys())
})
# Phase 2: Delete originals
old_env = self.environment.copy()
self.environment.clear()
self._stroke({
'event': 'FOLD_PHASE_2',
'deleted': list(old_env.keys())
})
# Phase 3: Save folded + rescramble
self.rotation += 1
scramble_key = self._derive_scramble_key()
folded_value = json.dumps(results)
scrambled = self._scramble(folded_value.encode(), scramble_key)
# The fold becomes a single new Const
fold_const = Const(
value=scrambled,
transform=None,
fold_count=self.rotation,
scramble_seed=scramble_key
)
self.environment[f"fold_{self.rotation}"] = fold_const
self._stroke({
'event': 'FOLD_PHASE_3',
'rotation': self.rotation,
'folded_name': f"fold_{self.rotation}",
'scrambled': True
})
return fold_const
def unfold(self, fold_name: str) -> Dict[str, Any]:
"""Unfold a previous fold (decrypt and restore)."""
if fold_name not in self.environment:
raise KeyError(f"Fold '{fold_name}' not found")
fold_const = self.environment[fold_name]
# Descramble
descrambled = self._scramble(fold_const.value, fold_const.scramble_seed)
results = json.loads(descrambled.decode())
self._stroke({
'event': 'UNFOLD',
'fold_name': fold_name,
'restored_keys': list(results.keys())
})
return results
def _derive_scramble_key(self) -> bytes:
"""Derive scramble key from genesis + rotation + XOR 73."""
seed = f"{self.genesis_hash}:{self.rotation}"
key = hashlib.sha256(seed.encode()).digest()
# XOR with 73
return bytes(b ^ 73 for b in key)
def _scramble(self, data: bytes, key: bytes) -> bytes:
"""XOR scramble (symmetric - same operation encrypts/decrypts)."""
extended_key = (key * ((len(data) // len(key)) + 1))[:len(data)]
return bytes(d ^ k for d, k in zip(data, extended_key))
def _stroke(self, event: Dict):
"""Stroke = append to JSONL log (feeds Rust runtime)."""
event['timestamp'] = time.time()
event['host'] = self.filepath
self.jsonl_log.append(json.dumps(event))
def get_strokes(self) -> str:
"""Get all JSONL strokes (for Rust runtime consumption)."""
return '\n'.join(self.jsonl_log)
def shutdown(self):
"""Shutdown the host."""
self._stroke({
'event': 'SHUTDOWN',
'final_rotation': self.rotation,
'environment_size': len(self.environment)
})
self._alive = False
# Write JSONL to actual file
with open(self.filepath, 'w') as f:
f.write(self.get_strokes())
return self.filepath

def objectify(data: Any) -> str:
    """In CODIE, all objects ARE strings."""
    if isinstance(data, str):
        return data
    elif isinstance(data, bytes):
        return base64.b64encode(data).decode()
    elif isinstance(data, Const):
        return data.to_string()
    else:
        return json.dumps(data, default=str)
def deobjectify(s: str) -> Any:
    """Strings back to objects (inverse)."""
    try:
        return json.loads(s)
    except ValueError:
        try:
            return base64.b64decode(s)
        except ValueError:
            return s
@dataclass
class RollingChain:
    """
    The 3-way rolling structure:
ROLL 1: Houses itself
- Chain contains its own execution environment
- No external dependencies
ROLL 2: Executes itself
- Naming triggers execution
- Chain processes its own instructions
ROLL 3: Folds into itself
- Results become new input
- Delete old, save new, scramble
- Ready for next cycle
"""
chain_id: str
genesis: str
rotation: int = 0
state: str = "READY" # READY → HOUSING → EXECUTING → FOLDING → READY
# The three rolls
house: Dict[str, str] = field(default_factory=dict) # Roll 1: Environment (strings)
execute: List[str] = field(default_factory=list) # Roll 2: Execution log (strings)
fold: Optional[str] = None # Roll 3: Folded result (string)
# JSONL output buffer
strokes: List[str] = field(default_factory=list)
def roll_house(self, name: str, value: Any):
"""ROLL 1: House a value. Everything becomes string."""
self.state = "HOUSING"
# Objects ARE strings
string_value = objectify(value)
self.house[name] = string_value
self._stroke('HOUSE', {'name': name, 'value_type': type(value).__name__})
def roll_execute(self, name: str) -> str:
"""ROLL 2: Execute by naming. Returns string result."""
self.state = "EXECUTING"
if name not in self.house:
raise NameError(f"Cannot execute '{name}' - not housed")
# Naming IS calling - retrieve and "execute"
value_str = self.house[name]
# The execution is the retrieval (inverted!)
self.execute.append(f"{name}={value_str}")
self._stroke('EXECUTE', {'name': name})
return value_str
def roll_fold(self) -> str:
"""ROLL 3: Fold everything into single scrambled string."""
self.state = "FOLDING"
# 1. Collect all current state
state_dump = {
'house': self.house,
'execute': self.execute,
'rotation': self.rotation
}
# 2. Serialize to string (objects ARE strings)
state_str = json.dumps(state_dump)
# 3. Derive scramble key
key = self._scramble_key()
# 4. Scramble
scrambled = bytes(ord(c) ^ k for c, k in zip(
state_str,
(key * ((len(state_str) // len(key)) + 1))[:len(state_str)]
))
# 5. Encode as string (objects ARE strings)
self.fold = base64.b64encode(scrambled).decode()
# 6. DELETE old state
self.house.clear()
self.execute.clear()
# 7. INCREMENT rotation
self.rotation += 1
# 8. SAVE fold as new house entry
self.house[f"fold_{self.rotation}"] = self.fold
self.state = "READY"
self._stroke('FOLD', {
'rotation': self.rotation,
'fold_size': len(self.fold)
})
return self.fold
def _scramble_key(self) -> bytes:
"""XOR 73 key derivation."""
seed = f"{self.genesis}:{self.rotation}"
raw_key = hashlib.sha256(seed.encode()).digest()
return bytes(b ^ 73 for b in raw_key)
def _stroke(self, event_type: str, data: Dict):
"""Append to JSONL stroke buffer."""
stroke = {
'type': event_type,
'chain_id': self.chain_id,
'rotation': self.rotation,
'state': self.state,
'timestamp': time.time(),
**data
}
self.strokes.append(json.dumps(stroke))
def get_jsonl(self) -> str:
"""Get JSONL output for Rust runtime."""
return '\n'.join(self.strokes)
def to_rust_env(self) -> str:
"""
Export as Rust environment variables format.
This is what "JSONL strokes Rust runtime" means.
"""
lines = []
lines.append(f'pub const CODIE_CHAIN_ID: &str = "{self.chain_id}";')
lines.append(f'pub const CODIE_GENESIS: &str = "{self.genesis}";')
lines.append(f'pub const CODIE_ROTATION: u64 = {self.rotation};')
if self.fold:
lines.append(f'pub const CODIE_FOLD: &str = "{self.fold}";')
# House as static map
lines.append('pub static CODIE_HOUSE: phf::Map<&str, &str> = phf_map! {')
for name, value in self.house.items():
escaped = value.replace('"', '\\"')
lines.append(f' "{name}" => "{escaped}",')
lines.append('};')
return '\n'.join(lines)
""" JSONL STROKE FORMAT - What Rust runtime consumes:
{"type":"BOOT","host":"file.codie","genesis":"abc123","rotation":0,"timestamp":1234567890.0} {"type":"INJECT","name":"foo","value":"{...}","timestamp":1234567890.1} {"type":"INVOKE","name":"foo","result":"42","timestamp":1234567890.2} {"type":"FOLD_PHASE_1","executed":["foo","bar"],"timestamp":1234567890.3} {"type":"FOLD_PHASE_2","deleted":["foo","bar"],"timestamp":1234567890.4} {"type":"FOLD_PHASE_3","rotation":1,"folded_name":"fold_1","scrambled":true,"timestamp":1234567890.5} {"type":"SHUTDOWN","final_rotation":1,"timestamp":1234567890.6}
Rust reads line-by-line, dispatches to handlers: BOOT → Initialize environment INJECT → Add to hashmap INVOKE → Call function pointer FOLD_* → Execute fold protocol SHUTDOWN → Cleanup and persist """
RUST_DISPATCHER_TEMPLATE = '''
// AUTO-GENERATED CODIE RUST DISPATCHER
use std::collections::HashMap;

pub struct CodieRuntime {
    genesis: String,
    rotation: u64,
    house: HashMap<String, String>,
}

impl CodieRuntime {
    pub fn from_jsonl(jsonl: &str) -> Self {
        let mut runtime = Self {
            genesis: String::new(),
            rotation: 0,
            house: HashMap::new(),
        };
for line in jsonl.lines() {
if let Ok(event) = serde_json::from_str::<serde_json::Value>(line) {
match event["type"].as_str() {
Some("BOOT") => {
runtime.genesis = event["genesis"].as_str().unwrap_or("").to_string();
runtime.rotation = event["rotation"].as_u64().unwrap_or(0);
}
Some("HOUSE") => {
if let (Some(name), Some(value)) = (
event["name"].as_str(),
event["value"].as_str()
) {
runtime.house.insert(name.to_string(), value.to_string());
}
}
Some("FOLD") => {
runtime.rotation = event["rotation"].as_u64().unwrap_or(runtime.rotation);
}
_ => {}
}
}
}
runtime
}
pub fn invoke(&self, name: &str) -> Option<&String> {
// NAMING IS CALLING
self.house.get(name)
}
}
'''
def demonstrate_inversion():
    """Show how CODIE inverts normal programming."""
print("╔═══════════════════════════════════════════════════════════════════════════════════════╗")
print("║ CODIE INVERSION DEMONSTRATION ║")
print("╚═══════════════════════════════════════════════════════════════════════════════════════╝")
print()
# ═══ 1. NAMES ARE FUNCTIONS ═══
print("═══ INVERSION 1: NAMES ARE FUNCTIONS ═══")
print()
print("Normal: def foo(): return 42; foo()")
print("CODIE: NAMES.foo = Const(42); NAMES.foo # naming IS calling")
print()
NAMES.foo = Const(42)
NAMES.bar = Const(145, transform="xor73")
print(f" NAMES.foo → {NAMES.foo}") # Returns 42
print(f" NAMES.bar → {NAMES.bar}") # Returns 145 ^ 73 = 72
print()
# ═══ 2. FILES ARE HOSTS ═══
print("═══ INVERSION 2: FILES ARE HOSTS ═══")
print()
print("Normal: open('file.txt', 'w').write(data)")
print("CODIE: FileHost('file.codie').boot().inject('data', Const(...))")
print()
host = FileHost('/tmp/demo.codie').boot("genesis_abc123")
host.inject("counter", Const(0))
host.inject("message", Const("hello world"))
print(f" Host booted: {host.filepath}")
print(f" Environment: {list(host.environment.keys())}")
print()
# ═══ 3. FUNCTIONS ARE CONST ═══
print("═══ INVERSION 3: FUNCTIONS ARE CONST ═══")
print()
print("Normal: function that mutates state")
print("CODIE: Const that returns immutable value")
print()
hash_func = Const("input_data", transform="hash")
print(f" hash_func.execute() → {hash_func.execute()[:32]}...")
print(f" (same input always same output - it's a constant)")
print()
# ═══ 4. OBJECTS ARE STRINGS ═══
print("═══ INVERSION 4: OBJECTS ARE STRINGS ═══")
print()
print("Normal: obj = {'key': 'value'} # structured data")
print("CODIE: obj = '{\"key\": \"value\"}' # everything is string")
print()
obj = {"complex": [1, 2, 3], "nested": {"deep": True}}
string_obj = objectify(obj)
print(f" Original: {obj}")
print(f" Stringified: {string_obj}")
print(f" Restored: {deobjectify(string_obj)}")
print()
# ═══ 5. 3-WAY ROLLING ═══
print("═══ 3-WAY ROLLING: HOUSE → EXECUTE → FOLD ═══")
print()
chain = RollingChain(
chain_id="demo_chain",
genesis="abc123def456"
)
# ROLL 1: House
print("ROLL 1 - HOUSING:")
chain.roll_house("x", 100)
chain.roll_house("y", 200)
chain.roll_house("z", {"nested": "data"})
print(f" Housed: {list(chain.house.keys())}")
print()
# ROLL 2: Execute
print("ROLL 2 - EXECUTING:")
result_x = chain.roll_execute("x")
result_y = chain.roll_execute("y")
print(f" Executed x → {result_x}")
print(f" Executed y → {result_y}")
print(f" Execution log: {chain.execute}")
print()
# ROLL 3: Fold
print("ROLL 3 - FOLDING:")
print(f" Before fold - rotation: {chain.rotation}, house size: {len(chain.house)}")
folded = chain.roll_fold()
print(f" After fold - rotation: {chain.rotation}, house size: {len(chain.house)}")
print(f" Folded value: {folded[:50]}...")
print(f" House now contains: {list(chain.house.keys())}")
print()
# ═══ 6. JSONL OUTPUT ═══
print("═══ JSONL STROKES (For Rust Runtime) ═══")
print()
jsonl = chain.get_jsonl()
for line in jsonl.split('\n')[:5]:
print(f" {line}")
print(" ...")
print()
# ═══ 7. RUST EXPORT ═══
print("═══ RUST ENVIRONMENT EXPORT ═══")
print()
rust_env = chain.to_rust_env()
for line in rust_env.split('\n')[:6]:
print(f" {line}")
print()
# Shutdown host
host.shutdown()
print(f"═══ Host shutdown, JSONL written to: {host.filepath} ═══")
def full_cycle():
    """
    Complete CODIE cycle:
    1. File becomes host
    2. Names become functions
    3. Functions become const
    4. Objects become strings
    5. 3-way roll
    6. Output JSONL
    7. Stroke Rust
    """
print()
print("╔═══════════════════════════════════════════════════════════════════════════════════════╗")
print("║ FULL CODIE CYCLE ║")
print("╚═══════════════════════════════════════════════════════════════════════════════════════╝")
print()
# Create chain
chain = RollingChain(
chain_id="full_cycle_001",
genesis=hashlib.sha256(b"genesis_seed").hexdigest()
)
# Simulate multiple fold cycles
for cycle in range(3):
print(f"═══ CYCLE {cycle + 1} ═══")
# House some data
chain.roll_house(f"data_{cycle}", {"cycle": cycle, "value": cycle * 100})
chain.roll_house(f"xor_test_{cycle}", 145 + cycle)
# Execute (naming = calling)
chain.roll_execute(f"data_{cycle}")
# Fold (delete → save → scramble)
chain.roll_fold()
print(f" Rotation: {chain.rotation}")
print(f" House keys: {list(chain.house.keys())}")
print()
# Final JSONL output
print("═══ FINAL JSONL OUTPUT ═══")
print()
print(chain.get_jsonl())
print()
# Rust export
print("═══ RUST RUNTIME EXPORT ═══")
print()
print(chain.to_rust_env())
if __name__ == '__main__':
    demonstrate_inversion()
    full_cycle()
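One property worth checking in isolation: the XOR scramble used by both `FileHost` and `RollingChain` is its own inverse, so the same operation encrypts and decrypts. A standalone sketch of that check, independent of the classes above:

```python
import hashlib

# Standalone check that the XOR scramble is symmetric:
# applying the same operation twice with the same key restores the data.
def scramble(data: bytes, key: bytes) -> bytes:
    # Extend the key to the data length, then XOR byte-by-byte
    extended = (key * (len(data) // len(key) + 1))[:len(data)]
    return bytes(d ^ k for d, k in zip(data, extended))

# Key derived as the spec describes: SHA256(genesis:rotation) XOR 73
key = bytes(b ^ 73 for b in hashlib.sha256(b"genesis_abc:1").digest())
plain = b"fold me"
cipher = scramble(plain, key)
assert scramble(cipher, key) == plain  # XOR twice with same key = identity
print("roundtrip ok")
```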
The filename IS the program.
.genesis.eva<>water.eva_1.magician<(build search button)>build<button>>.out
│ │ │ │ │ │ │ │
│ │ │ │ │ │ │ └─ OUTPUT
│ │ │ │ │ │ └──────── TARGET
│ │ │ │ │ └─────────────── ACTION
│ │ │ │ └────────────────────────────────────── SCOPE
│ │ │ └─────────────────────────────────────────────── DIMENSION
│ │ └────────────────────────────────────────────────────── EVA
│ └──────────────────────────────────────────────────────────── ELEMENT
└───────────────────────────────────────────────────────────────────────── GENESIS
Tables are vocabulary. Grammar is syntax. Execution is inverted.
Everything folds into itself, deletes, saves, and rescrambles.
"The Fool walks backwards. The nonce is on the circumference."
"The devtools build the production. The production contains the devtools. The cycle is complete."