Build the full script-generation pipeline using LLMs. The pipeline accepts user input plus context, generates structured script output, and stores versioned content for reuse and later editing.
Scope:
- Build a backend service that invokes the LLM with contextual prompts (see the endpoint sketch after this list)
- Create the script-generation endpoint (POST /api/scripts)
- Connect the UI to trigger generation based on user input and configuration
- Add metadata tagging (tone, type, tags, etc.); see the data-model sketch below
- Start handling streaming or async generation if needed (streaming sketch below)
- Version generated scripts for later edits and history (covered by the same data model)
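
For the backend service and the POST /api/scripts endpoint, here is a minimal sketch assuming an Express server and the OpenAI Node SDK. The route path matches the scope above, but the prompt structure, default model name, and response shape are assumptions, not decisions.

```typescript
// Minimal sketch of the script-generation endpoint (assumptions: Express + OpenAI Node SDK).
import express from "express";
import OpenAI from "openai";

const app = express();
app.use(express.json());

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

app.post("/api/scripts", async (req, res) => {
  const { input, context, config } = req.body ?? {};
  if (!input) {
    return res.status(400).json({ error: "Missing 'input' in request body" });
  }

  try {
    // Contextual prompt: the system message carries project context,
    // the user message carries the actual generation request.
    const completion = await openai.chat.completions.create({
      model: config?.model ?? "gpt-4o-mini", // assumed default model
      messages: [
        { role: "system", content: `You write structured scripts. Context: ${context ?? "none"}` },
        { role: "user", content: input },
      ],
    });

    const script = completion.choices[0]?.message?.content ?? "";
    res.json({ script });
  } catch (err) {
    res.status(502).json({ error: "LLM generation failed" });
  }
});

app.listen(3000);
```
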
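The metadata-tagging and versioning items can share one storage shape. The interfaces below are a hypothetical data model, not a schema decision; field names like `tone`, `parentVersionId`, and the edit-as-append approach are placeholders to make the idea concrete.

```typescript
// Hypothetical data model for tagged, versioned scripts (all field names are placeholders).
import { randomUUID } from "node:crypto";

interface ScriptMetadata {
  tone?: string;   // e.g. "casual", "formal"
  type?: string;   // e.g. "ad", "tutorial", "explainer"
  tags: string[];
}

interface ScriptVersion {
  id: string;
  scriptId: string;         // stable id shared by all versions of one script
  version: number;          // 1, 2, 3, ... per scriptId
  content: string;          // the generated (or edited) script text
  metadata: ScriptMetadata;
  createdAt: string;        // ISO timestamp
  parentVersionId?: string; // previous version, if this is an edit
}

// Editing never mutates an existing version; it appends a new one linked to its parent.
function nextVersion(previous: ScriptVersion, content: string): ScriptVersion {
  return {
    ...previous,
    id: randomUUID(),
    version: previous.version + 1,
    content,
    createdAt: new Date().toISOString(),
    parentVersionId: previous.id,
  };
}
```
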
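If streaming turns out to be needed, the OpenAI Node SDK supports it via `stream: true`; the sketch below forwards token deltas to the client over server-sent events. The route name and SSE framing are assumptions.

```typescript
// Sketch of streaming generation over server-sent events (assumed route; Express + OpenAI SDK).
import express from "express";
import OpenAI from "openai";

const app = express();
app.use(express.json());
const openai = new OpenAI();

app.post("/api/scripts/stream", async (req, res) => {
  res.setHeader("Content-Type", "text/event-stream");
  res.setHeader("Cache-Control", "no-cache");
  res.setHeader("Connection", "keep-alive");

  const stream = await openai.chat.completions.create({
    model: "gpt-4o-mini", // assumed model
    messages: [{ role: "user", content: req.body?.input ?? "" }],
    stream: true,
  });

  // Forward each token delta to the client as it arrives.
  for await (const chunk of stream) {
    const delta = chunk.choices[0]?.delta?.content;
    if (delta) res.write(`data: ${JSON.stringify({ delta })}\n\n`);
  }
  res.write("data: [DONE]\n\n");
  res.end();
});

app.listen(3001);
```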