
Unified generation outputs #6

Merged 1 commit into xyproto:main on Aug 6, 2024
Conversation

jonathanhecl (Contributor)

Unified the generation outputs, and added the token counts to the response.

jonathanhecl (Contributor, Author)

Create a new tag for this version :)

xyproto (Owner) commented on Aug 5, 2024

This is great, and thanks for the PR!

It is important to me that existing programs that use ollamaclient can continue to use the same API.

So, instead of:

generatedOutput, err := oc.GetOutput(prompt)
[...]
output := generatedOutput.Response

And:

generatedOutput := oc.MustOutput(prompt)
[...]
output := generatedOutput.Response

Could oc.GetOutput and oc.MustOutput perhaps keep the same return types as before, with new methods added instead?

For instance, oc.GetResult and oc.MustResult?
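
Something along these lines, perhaps (just a sketch of the idea: the Config receiver is assumed to be the package's existing client struct, and the OutputResponse field names are illustrative, not the final API):

package ollamaclient

// OutputResponse bundles the generated text with the token counts
// that this PR adds to the response (field names are a sketch).
type OutputResponse struct {
	Response       string
	PromptTokens   int
	ResponseTokens int
}

// GetResult is the new method that returns the full struct.
// (The actual HTTP call to Ollama is elided in this sketch.)
func (oc *Config) GetResult(prompt string) (OutputResponse, error) {
	// ... call /api/generate and fill in the struct ...
	return OutputResponse{}, nil
}

// GetOutput keeps its original (string, error) signature by delegating
// to GetResult, so existing callers continue to compile unchanged.
func (oc *Config) GetOutput(prompt string) (string, error) {
	result, err := oc.GetResult(prompt)
	return result.Response, err
}

// MustOutput keeps returning a plain string, as before
// (error handling is elided in this sketch).
func (oc *Config) MustOutput(prompt string) string {
	output, _ := oc.GetOutput(prompt)
	return output
}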

jonathanhecl (Contributor, Author)

Yes, it can work like that. However, I am a bit concerned that Ollama has three endpoints for getting answers, and it is not always clear when to use one or the other:

/chat/completions (returns "choices"; in principle it is meant for a chain of messages)
/api/generate (the one originally used for plain text generation, but it also returns the prompt token count)
/api/chat (the one used for tools)

Of these, we only use the last two.
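
For reference, the relevant response fields of those two endpoints look roughly like this (a sketch based on the Ollama API documentation, assuming non-streaming requests and listing only the fields that matter here):

// /api/generate: plain text generation; includes the token counts.
type generateResponse struct {
	Response        string `json:"response"`
	PromptEvalCount int    `json:"prompt_eval_count"` // tokens in the prompt
	EvalCount       int    `json:"eval_count"`        // tokens in the generated answer
	Done            bool   `json:"done"`
}

// /api/chat: message-based; also the endpoint used for tool calling.
type chatResponse struct {
	Message struct {
		Role      string `json:"role"`
		Content   string `json:"content"`
		ToolCalls []any  `json:"tool_calls"` // present when tools are invoked
	} `json:"message"`
	PromptEvalCount int  `json:"prompt_eval_count"`
	EvalCount       int  `json:"eval_count"`
	Done            bool `json:"done"`
}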

xyproto (Owner) commented on Aug 6, 2024

As far as I can tell, /chat/completions exists only for OpenAI compatibility, so I'm not too worried about leaving that endpoint unsupported, unless it offers advantages or possibilities that the other endpoints do not: https://ollama.com/blog/openai-compatibility

xyproto merged commit abd7b76 into xyproto:main on Aug 6, 2024 (1 check passed)
xyproto (Owner) commented on Aug 6, 2024

I merged the PR and then adjusted it to keep backward compatibility for existing programs and packages that use the ollamaclient package.
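
In practice that means existing code keeps working unchanged, while new code can read the token counts, roughly like this (method and field names follow the sketch above and are illustrative; imports and client setup are elided):

// Old call sites: a plain string comes back, exactly as before.
output, err := oc.GetOutput(prompt)
if err != nil {
	log.Fatalln(err)
}
fmt.Println(output)

// New call sites can ask for the richer result that includes token counts.
result, err := oc.GetResult(prompt)
if err != nil {
	log.Fatalln(err)
}
fmt.Printf("%s (%d prompt tokens, %d response tokens)\n",
	result.Response, result.PromptTokens, result.ResponseTokens)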
