StoryBlocks is an AI-powered creative writing platform designed to help writers overcome blocks, develop story ideas, and build full novellas or novels one "block" at a time. It turns storytelling into a modular, interactive experience powered by language models accessed through OpenRouter or a local LLM, making it perfect for writers who want structure, inspiration, or just a creative push.
Whether you're a novelist struggling with writer's block, a game designer building branching narratives, or just someone who wants to experiment with interactive storytelling, StoryBlocks provides a seamless interface to generate, manage, and evolve your ideas into complete narratives.
Each StoryBlock represents a scene or decision point, helping you sculpt your narrative path piece by piece. Choose from different genres and styles, guide the AI with prompts, and shape your story as a linear tale or a branching epic. Once you're done, you can export everything as a single compiled text or HTML file.
Break through creative blocks
Let the AI give you a head start when the blank page feels too heavy.
Modular storytelling
Build stories block-by-block, making editing and idea expansion easier.
Branching narratives
Create interactive fiction with two-choice decision trees, perfect for choose-your-own-adventure formats.
Publish-ready pipeline (coming soon)
After writing, submit your story to the public StoryBlocks Feature Site, where others can vote, comment, and explore your work.
AI-generated book covers
Generate stunning book cover art using AI image generation tools.
Future: On-chain storytelling
Finalized works can be minted immutably to the blockchain (optional), allowing you to preserve and share your stories forever.
StoryBlocks isn’t just a tool. It’s a creative companion designed to help you think less about the struggle of writing and more about the joy of storytelling.
A simple Node service that generates story blocks using an LLM via OpenRouter or an optional local LLM. Other providers may be added later, possibly including OpenAI directly, along with a universal memory module handler (tm).
npm install
cp .env.example .env # edit with your API key
npm start
# or directly
node index.js
POST /api/blocks/:id/generate with JSON { "prompt": "..." }. Requires the user to have credits. Generated text is stored in the SQLite database storyblocks.db.
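For example, a generation request might look like the following (the port, block ID, and Bearer-token auth header are assumptions for illustration):

```bash
# Hypothetical example: generate content for block 42
curl -X POST http://localhost:3000/api/blocks/42/generate \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d '{"prompt": "The detective opens the sealed letter."}'
```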
Generation currently runs mainly through OpenRouter, with a local LLM used for special testing.
Set OPENROUTER_API_KEY in .env to use OpenRouter. If USE_LOCAL_LLM=true,
requests are sent to http://localhost:8000/generate instead.
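For instance, an .env fragment toggling between the two backends might look like this (the key value is a placeholder):

```bash
OPENROUTER_API_KEY=sk-or-xxxxxxxx   # placeholder key
# Set to true to route requests to http://localhost:8000/generate instead
USE_LOCAL_LLM=false
```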
StoryBlocks is an interactive AI-driven story creation tool. Narratives are built one "StoryBlock" at a time, with each block generated by a large language model and stored in a persistent SQLite database. Users create stories, generate blocks, choose branching options, and can later export or even publish their work to the blockchain.
The project demonstrates how AI generation can power choose-your-own-adventure experiences. A simple editor lets writers manage stories and blocks while the backend tracks storage and credit usage. OpenRouter or a local LLM produces the content, ensuring each block adheres to the selected genre, style, and perspective. Eventually stories will be shareable on a public portal and immutably stored on chain.
- JWT-based authentication with email verification (optional admin can be seeded via ADMIN_EMAIL and ADMIN_PASSWORD)
- SQLite persistence for users, stories, blocks, and choices
- Storage and credit limits per user, with APIs to check usage
- Editor UI for creating stories and generating blocks with two short choices
- LLM integration via OpenRouter with optional local fallback
- Pages per block selectable from 1–10; generation retries if output is too short
- Export stories to text or HTML including only your chosen branches
- Install dependencies: npm install
- Copy .env.example to .env and fill in SMTP, JWT_SECRET, and OpenRouter settings.
- Start the server: npm start
- Visit http://localhost:3000 to see the welcome page with links to log in or register.
See .env.example for required variables. JWT_SECRET is mandatory. You can also configure DEFAULT_STORAGE_LIMIT (story blocks per user), DEFAULT_USER_CREDITS (LLM generations) and DEFAULT_TEMPERATURE (LLM sampling). Admin users have unlimited storage and credits. An admin account can be seeded by providing ADMIN_EMAIL and ADMIN_PASSWORD. The optional OPENROUTER_MODELS variable can list comma-separated model IDs for the preferences page.
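A hypothetical .env sketch combining these settings (all values are illustrative; .env.example remains the authoritative reference):

```bash
JWT_SECRET=change-me                    # mandatory
DEFAULT_STORAGE_LIMIT=100               # story blocks per user (illustrative)
DEFAULT_USER_CREDITS=50                 # LLM generations (illustrative)
DEFAULT_TEMPERATURE=0.8                 # LLM sampling (illustrative)
ADMIN_EMAIL=admin@example.com           # seeds an admin account
ADMIN_PASSWORD=change-me-too
OPENROUTER_MODELS=openai/gpt-4o,anthropic/claude-3.5-sonnet   # shown on the preferences page
```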
To enable credit purchases, set BOOST_CHECKOUT_URL to your Stripe payment link. When users complete payment they receive 250 credits and 100 additional storage slots.
Authenticated clients can query /api/user/storage for current block usage and /api/user/stats for remaining credits.
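For example (assuming the JWT is presented as a Bearer token; adjust to the actual auth scheme):

```bash
curl -H "Authorization: Bearer $TOKEN" http://localhost:3000/api/user/storage
curl -H "Authorization: Bearer $TOKEN" http://localhost:3000/api/user/stats
```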
Set OPENROUTER_API_KEY and OPENROUTER_MODEL in your .env file to enable LLM generation. The /api/blocks/:id/generate and /api/blocks/:id/choices routes contact OpenRouter to produce content and choices. The server checks generated length (roughly 200 words per page) and requests the LLM to expand if needed.
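A sketch of requesting choices for an existing block (the empty body and auth header are assumptions; the route may accept additional parameters):

```bash
curl -X POST http://localhost:3000/api/blocks/42/choices \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d '{}'
```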
For quick testing you can start the server without permanently storing your OpenRouter API key. Execute:
./test_server.sh

The script loads variables from .env (if present), then prompts for the API key and launches the server with NODE_ENV=test. The key is passed only to the running process and never written to disk.
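A minimal sketch of what such a script could contain, based on the description above (the actual test_server.sh may differ):

```bash
#!/usr/bin/env bash
# Sketch only: load .env if present, prompt for the key, run with NODE_ENV=test
set -euo pipefail

if [ -f .env ]; then
  set -a        # export every variable sourced from .env
  . ./.env
  set +a
fi

if [ -z "${OPENROUTER_API_KEY:-}" ]; then
  read -r -s -p "OpenRouter API key: " OPENROUTER_API_KEY
  echo
fi

# The key is passed only to the server process, never written to disk
NODE_ENV=test OPENROUTER_API_KEY="$OPENROUTER_API_KEY" node index.js
```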
- GET /api/stories – List all stories
- POST /api/stories – Create a new story
- GET /api/stories/:id – Retrieve a story
- PUT /api/stories/:id – Update a story
- DELETE /api/stories/:id – Delete a story
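For instance, creating a story might look like this (the field names in the body are illustrative, not a documented schema):

```bash
curl -X POST http://localhost:3000/api/stories \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d '{"title": "The Clockwork Harbor", "genre": "steampunk"}'
```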
- GET /api/stories/:id/blocks – List blocks for a story
- POST /api/stories/:id/blocks – Create a block
- PUT /api/blocks/:id – Update a block
- DELETE /api/blocks/:id – Delete a block
- POST /api/blocks/:id/generate – Generate block content (uses a credit)
- POST /api/blocks/:id/choices – Generate or add choices
- POST /api/blocks/:id/continue – Continue from a choice
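And a sketch of continuing from a choice (the request body shape is an assumption):

```bash
curl -X POST http://localhost:3000/api/blocks/42/continue \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d '{"choiceId": 1}'
```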
- GET /api/user – Get current user profile
- GET /api/user/storage – Storage usage
- GET /api/user/stats – Remaining credits
- GET /api/models – Available LLM models