forked from firecrawl/open-lovable
Closed
Description
Goals
- Provide a thin OpenAI client that reads configuration from environment variables.
- Expose a streaming chat endpoint at `/api/ai/chat` with raw text output, using the Edge runtime.
- Expose a health endpoint at `/api/health` reporting the status of the OpenAI service.
- Update documentation with required environment variables and show example usage.
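The thin client in the first goal could be sketched along these lines. This is only an illustration: the `readOpenAIConfig` helper, the default model and base URL, and the async-generator shape of `streamChat()` are assumptions, not part of the issue.

```typescript
// Hypothetical sketch of lib/ai/openai.ts. Helper names and the default
// model/base URL below are assumptions for illustration only.

export interface OpenAIConfig {
  apiKey: string;
  model: string;
  baseUrl: string;
}

// Read and validate configuration from environment variables.
export function readOpenAIConfig(
  env: Record<string, string | undefined> = process.env
): OpenAIConfig {
  if (env.AI_PROVIDER !== "openai") {
    throw new Error("AI_PROVIDER must be set to 'openai'");
  }
  if (!env.OPENAI_API_KEY) {
    throw new Error("OPENAI_API_KEY is missing");
  }
  return {
    apiKey: env.OPENAI_API_KEY,
    model: env.OPENAI_MODEL ?? "gpt-4o-mini", // assumed default
    baseUrl: env.OPENAI_BASE_URL ?? "https://api.openai.com/v1",
  };
}

// Stream a chat completion as raw text chunks via the OpenAI REST API.
export async function* streamChat(
  messages: { role: string; content: string }[],
  opts: { model?: string; temperature?: number } = {}
): AsyncGenerator<string> {
  const cfg = readOpenAIConfig();
  const res = await fetch(`${cfg.baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${cfg.apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: opts.model ?? cfg.model,
      temperature: opts.temperature,
      messages,
      stream: true,
    }),
  });
  if (!res.ok || !res.body) {
    throw new Error(`OpenAI request failed: ${res.status}`);
  }
  // Buffer across chunk boundaries so SSE lines are parsed whole.
  const decoder = new TextDecoder();
  let buf = "";
  for await (const chunk of res.body as unknown as AsyncIterable<Uint8Array>) {
    buf += decoder.decode(chunk, { stream: true });
    const lines = buf.split("\n");
    buf = lines.pop() ?? "";
    for (const line of lines) {
      const data = line.replace(/^data: /, "").trim();
      if (!data || data === "[DONE]") continue;
      const delta = JSON.parse(data).choices?.[0]?.delta?.content;
      if (delta) yield delta;
    }
  }
}
```

Passing the environment object into `readOpenAIConfig()` (instead of reading `process.env` directly inside every function) keeps validation testable and lets the health endpoint reuse the same checks.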
Acceptance Criteria
- [ ] A new file `lib/ai/openai.ts` implements `streamChat()` using the `OPENAI_API_KEY`, `OPENAI_MODEL`, `OPENAI_BASE_URL`, and `AI_PROVIDER=openai` environment variables.
- [ ] A new API route `/api/ai/chat` in the Edge runtime accepts chat messages `{ messages, model?, temperature? }` and streams back plain text output. If `AI_PROVIDER` is not set to `openai`, the route returns an error.
- [ ] A new API route `/api/health` returns JSON `{ status, services: { openai: { ok, model, error? } } }`; it uses `OPENAI_API_KEY` to validate connectivity with OpenAI, and sets `ok: false` with a helpful error when the key is missing or invalid.
- [ ] The README explains the required environment variables (`AI_PROVIDER`, `OPENAI_API_KEY`, `OPENAI_MODEL`, `OPENAI_BASE_URL`) and includes simple instructions for the streaming chat and health check.
- [ ] The feature does not crash the app when secrets are missing; `openai.ok` is set to `false` with an error in `/api/health`.
- [ ] The PR description includes screenshots of the new `/api/health` endpoint (showing both successful and missing-key responses) and a sample streamed chat.
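The health-check criteria above imply a payload builder that never throws, so a missing key degrades the response instead of crashing the app. A minimal sketch follows; the `buildHealthPayload` helper name, the `"degraded"` status value, and the default model are assumptions for illustration:

```typescript
// Hypothetical payload builder for the /api/health route described in the
// acceptance criteria; helper name and "degraded" status are assumptions.

interface OpenAIHealth {
  ok: boolean;
  model?: string;
  error?: string;
}

interface HealthPayload {
  status: "ok" | "degraded";
  services: { openai: OpenAIHealth };
}

// Build the payload without throwing, so missing secrets never crash the app.
export function buildHealthPayload(
  env: Record<string, string | undefined> = process.env
): HealthPayload {
  let openai: OpenAIHealth;
  if (!env.OPENAI_API_KEY) {
    openai = { ok: false, error: "OPENAI_API_KEY is not set" };
  } else if (env.AI_PROVIDER !== "openai") {
    openai = { ok: false, error: "AI_PROVIDER must be set to 'openai'" };
  } else {
    // A real implementation would also ping the OpenAI API here to
    // confirm the key is valid, setting ok: false on failure.
    openai = { ok: true, model: env.OPENAI_MODEL ?? "gpt-4o-mini" };
  }
  return { status: openai.ok ? "ok" : "degraded", services: { openai } };
}
```

The route handler would then be a one-liner returning `Response.json(buildHealthPayload())`, which keeps the "does not crash when secrets are missing" criterion trivially satisfied.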