An OpenAI API proxy that can be deployed to Docker or cloud functions: a simple proxy for the OpenAI API via a one-line docker command
🌳 If you would rather not set this up yourself, try API2D.com, a third-party OpenAI API service that is accessible from mainland China and supports WeChat top-ups; it works with Chat酱, OpenCat, NextWeb, and the VSCode plugin.
- Tencent Cloud Function deployment tutorial 🔥 Tencent Cloud Functions have supported SSE in all regions since April 25; recommended
- Usage instructions in Simplified Chinese
- "How to Quickly Build an OpenAI/GPT Application: Notes from a Developer in China"
🎉 SSE is now supported, so content can be returned in real time
The following English was translated by GPT.
- Supports SSE streaming output
- Built-in text moderation (requires Tencent Cloud KEY configuration)
- 💪 Text moderation works even with SSE streaming output; that's how powerful it is.
You can deploy ./app.js to any environment that supports Node.js 14+, such as cloud functions and edge computing platforms.
- Copy app.js and package.json to the directory
- Install dependencies with yarn install
- Start the service with node app.js
docker run -p 9000:9000 easychen/ai.level06.com:latest
The proxy address is http://${IP}:9000
- PORT: Service port
- PROXY_KEY: Proxy access key, used to restrict access
- TIMEOUT: Request timeout, default 30 seconds
- TENCENT_CLOUD_SID: Tencent Cloud secret_id
- TENCENT_CLOUD_SKEY: Tencent Cloud secret_key
- TENCENT_CLOUD_AP: Tencent Cloud region (e.g. ap-singapore Singapore)
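For reference, here is a minimal sketch (an assumption, not the project's actual code) of how a Node.js service such as app.js would typically read these settings via process.env; only the variable names come from the list above, and the defaults are guesses:

```js
// Minimal sketch: variable names from this README; defaults and types are assumptions.
const config = {
  port: Number(process.env.PORT || 9000),
  proxyKey: process.env.PROXY_KEY || '',              // empty string = no access restriction
  timeoutSeconds: Number(process.env.TIMEOUT || 30),  // request timeout, default 30 seconds
  tencentCloud: {
    secretId: process.env.TENCENT_CLOUD_SID,          // used for text moderation
    secretKey: process.env.TENCENT_CLOUD_SKEY,
    region: process.env.TENCENT_CLOUD_AP,             // e.g. 'ap-singapore'
  },
};
```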
- Change the OpenAI request address (domain/IP, including the port number) in the original project (e.g. https://api.openai.com) to the domain/IP of this proxy; a request sketch follows this list.
- If PROXY_KEY is set, append :<PROXY_KEY> after the OpenAI key; if it is not set, no modification is required.
- moderation: true enables moderation, false disables it
- moderation_level: high interrupts all sentences whose moderation result is not Pass; low only interrupts sentences whose moderation result is Block.
- Only supports GET and POST methods, not file-related interfaces.
- ~~SSE is not currently supported, so stream-related options need to be turned off~~ Now supported.
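To make the notes above concrete, here is a hedged request sketch against the proxy. It assumes Node.js 18+ (built-in fetch), a proxy listening on localhost:9000, and that moderation and moderation_level are sent as extra fields in the request body; the key and model are placeholders.

```js
// Sketch only: URL, key, model, and the placement of the moderation fields are assumptions.
const res = await fetch('http://localhost:9000/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    // Append :<PROXY_KEY> to the OpenAI key only if the proxy was started with PROXY_KEY
    Authorization: 'Bearer sk-xxxx:<proxy_key_here>',
  },
  body: JSON.stringify({
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'user', content: 'Hello' }],
    moderation: true,          // enable text moderation
    moderation_level: 'high',  // interrupt anything whose result is not Pass
  }),
});
const data = await res.json();
console.log(data.choices[0].message.content);
```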
Using https://www.npmjs.com/package/chatgpt as an example:
import * as gpt from 'chatgpt'

const chatApi = new gpt.ChatGPTAPI({
  apiKey: 'sk.....:<proxy_key_here>',      // append :<PROXY_KEY> only if PROXY_KEY is set
  apiBaseUrl: "http://localhost:9001/v1",  // Replace with the proxy domain/IP
});
- For SSE, refer to the related code in the chatgpt-api project
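As a usage sketch (assuming an ES module context for top-level await and a proxy on localhost:9000), the chatgpt package can consume the proxy's SSE output through its onProgress callback:

```js
import * as gpt from 'chatgpt'

const api = new gpt.ChatGPTAPI({
  apiKey: 'sk.....:<proxy_key_here>',
  apiBaseUrl: 'http://localhost:9000/v1', // point at the proxy
})

// onProgress fires as tokens stream in over SSE; partial.text is the text received so far.
const res = await api.sendMessage('Hello', {
  onProgress: (partial) => console.log(partial.text),
})
console.log(res.text)
```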