A simple, robust, and composable data transformation library for TypeScript/JavaScript.
This library is built on a foundation of three core principles:
- Unix Philosophy: Each part of this library is a small, sharp tool that does one thing well. We compose these simple, predictable tools to build complex and reliable behavior.
- Parse, Don't Validate: We never trust data. We force it through a parser that returns a new, compiler-verified value. This eliminates an entire class of bugs at the source.
- Simplicity over Ease: We prioritize correctness and predictability. Logic is treated as data (JSON Schema, JSONLogic rules), which makes the system transparent and configurable.
Key features:
- Type-Safe Transformations: Ensure your data transformations are type-safe from end to end.
- Bi-Directional: Decode and encode data between two different shapes.
- Composable: Build complex pipelines from simple, reusable functions.
- Declarative: Define your transformation logic as data using JSON Schema and JSON Logic.
- Self-Testing: Verify your data contracts with the built-in verification function.
Install the dependencies:

```bash
npm install
```

Here's a simple example of how to use the library to decode and encode data.
```ts
import { processPacket, verifyPacket } from "./src/packet-processor";
import { readFileSync } from "fs";

// 1. Load a data contract packet
const packet = JSON.parse(readFileSync("./sample-packet.json", "utf-8"));

// 2. Create a transformer from the packet
const transformer = processPacket(packet);

// 3. Verify the packet (optional)
const isValid = verifyPacket(packet);
console.log(`Packet is ${isValid ? "valid" : "invalid"}`);

// 4. Decode data
const decoded = transformer.decode({ firstName: "John", lastName: "Smith" });
console.log(decoded); // { fullName: 'John Smith' }

// 5. Encode data
const encoded = transformer.encode({ fullName: "Jane Smith" });
console.log(encoded); // { firstName: 'Jane', lastName: 'Smith' }
```

A Data Contract Packet is a JSON object that contains everything needed to perform a bi-directional transformation:
- `inputSchema`: The JSON Schema for the input data.
- `outputSchema`: The JSON Schema for the output data.
- `decoderRules`: The JSON Logic rules for decoding the data (input -> output).
- `encoderRules`: The JSON Logic rules for encoding the data (output -> input).
- `data`: An array of sample data that can be used to verify the packet.
This packet is a self-contained unit that can be easily shared and tested.
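For illustration, a minimal packet might look something like the sketch below. It assumes a simple field rename (`first_name` -> `firstName`) and that the decoder/encoder rules map each target field to a JSON Logic expression; see API.md for the authoritative packet shape.

```ts
// Illustrative only: the rule format below (target field -> JSON Logic
// expression) is an assumption; check API.md for the real packet shape.
const samplePacket = {
  inputSchema: {
    type: "object",
    properties: { first_name: { type: "string" } },
    required: ["first_name"],
  },
  outputSchema: {
    type: "object",
    properties: { firstName: { type: "string" } },
    required: ["firstName"],
  },
  // Decode: copy the input's first_name into the output's firstName.
  decoderRules: { firstName: { var: "first_name" } },
  // Encode: map it back the other way.
  encoderRules: { first_name: { var: "firstName" } },
  // Sample data used by verification.
  data: [{ first_name: "Ada" }],
};
```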
To see the library in action, run the demo script:
```bash
bun run demo.ts
```

This will process three different data packets: a simple one, a complex one, and an invalid one.
A CLI is available via the `schemaflow` entrypoint. It supports both Bun and Node (using a built JS bundle). The following flags are supported:
- `--help`, `-h`: show help
- `--version`: show the package version
- `--verbose`: show detailed error output / stack traces
- `-o, --output <file>`: write transformed output to a file
- `-c, --config <file>`: load a JSON configuration file (defaults to `.ajv-decoderrc` in the project root)
Basic usage:
- Verify a packet file:
  - bun: `bun run src/cli.ts /path/to/packet.json`
  - node (after build): `node dist/cli.js /path/to/packet.json`
- Decode a JSON payload (positional):
  - bun: `bun run src/cli.ts /path/to/packet.json '{"foo":"bar"}'`
- Decode from stdin: `cat data.json | bun run src/cli.ts /path/to/packet.json`
- Write output to a file: `bun run src/cli.ts -o out.json /path/to/packet.json '{"foo":"bar"}'`
Configuration:
- Place a JSON file named `.ajv-decoderrc` in the project root, or pass `--config <file>` to load options used by the CLI.
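Flags can be combined with any of the invocation styles above. For example (the config path, packet path, and input file are illustrative), the following decodes stdin input with an explicit config file, verbose errors, and file output:

```bash
cat data.json | bun run src/cli.ts --verbose -c ./cli-config.json -o out.json /path/to/packet.json
```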
Testing:
- The CLI exposes an exported `runCli(argv?, transformFn?)` function used by tests. Tests inject a small `transformFn` to avoid loading the full packet-processing pipeline and to make CLI behavior deterministic.
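As a rough sketch (the exact `transformFn` signature and `argv` shape are assumptions; check `src/cli.ts` for the real types), such a test might look like this:

```ts
// Sketch of a CLI test with an injected stub transform.
// Assumptions: runCli accepts raw argv (without the node/script prefix)
// and transformFn receives the parsed packet and payload. Adjust to the
// actual signature exported from src/cli.ts.
import { runCli } from "./src/cli";

// Stub that skips the packet-processing pipeline and echoes the payload.
const stubTransform = (_packet: unknown, payload: unknown) => payload;

// Decode a payload against a packet file using the stub.
runCli(["/path/to/packet.json", '{"foo":"bar"}'], stubTransform);
```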
npx / node usage:
- To run via npx, produce a Node-compatible build first (see `docs/cli.md`) and ensure `bin` points to the built JS file.
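For example, the `bin` entry in `package.json` might look like the fragment below; the entry name mirrors the `schemaflow` entrypoint mentioned above, and `./dist/cli.js` mirrors the node example, so adjust both to the actual build output.

```json
{
  "bin": {
    "schemaflow": "./dist/cli.js"
  }
}
```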
For a detailed API reference, please see the API.md file.
For guidelines on how to contribute to this project, please see the GEMINI.md file.