
Feature request: Custom OpenAI API URL endpoint #484

Open
DutchEllie opened this issue Nov 11, 2023 · 3 comments

Comments

@DutchEllie

Several services have started to adopt the OpenAI API as their own standard. Not only Azure OpenAI Service (if, for some unknown reason, you want to use that), but most notably Oobabooga, which has recently migrated its API to the OpenAI standard.

Currently the endpoint is hard-coded, but a simple edit should make it configurable.
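To illustrate what I mean, here is a minimal sketch of how a configurable base URL could replace a hard-coded one. The function name and defaults are my own invention, not KoboldAI's actual code; `/v1/completions` is the standard OpenAI completions path that compatible servers like Oobabooga expose.

```python
# Hypothetical sketch: build an OpenAI-style completion request against a
# configurable base URL instead of a hard-coded endpoint.
DEFAULT_BASE_URL = "https://api.openai.com"

def build_completion_request(prompt, base_url=DEFAULT_BASE_URL,
                             model="gpt-3.5-turbo-instruct", max_tokens=80):
    """Return the (url, payload) pair for an OpenAI-compatible completion call.

    base_url can point at any server that speaks the OpenAI API, e.g. a
    local Oobabooga instance on http://127.0.0.1:5000 (port is an example).
    """
    url = base_url.rstrip("/") + "/v1/completions"
    payload = {"model": model, "prompt": prompt, "max_tokens": max_tokens}
    return url, payload

# Same client code, pointed at a local OpenAI-compatible server:
url, payload = build_completion_request("Hello", base_url="http://127.0.0.1:5000")
print(url)  # http://127.0.0.1:5000/v1/completions
```

The actual request would then be sent with any HTTP client; only the base URL changes between providers.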

@henk717
Owner

henk717 commented Nov 11, 2023

The priority of United has been local generation, so while this would be nice to have, we are currently focusing on improving and fixing the locally run backends rather than adding an online one.

For online backends I recommend https://lite.koboldai.net, which supports this and runs entirely in your browser. If you want something you can keep offline, you can view and download its source here: https://github.com/LostRuins/lite.koboldai.net/blob/main/index.html

The existing OpenAI backend we have locally is so outdated it needs to be replaced entirely at this point, since it's built on a much older standard that I think only GooseAI still uses. So unfortunately it's not a simple edit to expose the field.

@DutchEllie
Author

Ah, okay. I guess that's why it didn't work when I just edited the field. I was messing with the code to see if I could submit a PR myself; it seemed like a simple thing to fix. However, I see now that it also doesn't work because of the outdated standard.

Well, I tried looking at the code myself to see if I could implement it somehow, but it's going way over my head, as expected. Thanks for the recommendation of lite.koboldai.net. Maybe I'll try that, or see if I can somehow load my GPTQ models from Ooba in your KoboldAI program instead.

@henk717
Owner

henk717 commented Nov 12, 2023

We support GPTQ out of the box, so all you have to do is move the model folder over to our models folder. No need for Ooba.
Once you load it from the folder, you can switch the backend from Huggingface to Exllama for extra speed.
