
Conversation

@alappe (Contributor) commented Sep 17, 2024

I noticed that https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-chat-completion and https://github.com/ollama/ollama/blob/main/docs/modelfile.md#valid-parameters-and-values document that these options all need to be passed in the options: {} sub-structure.

Testing against my local instance with llama v3.1 and ollama v0.3.10, it complains that num_gqa is unknown. I left it in for now because other models might need it.
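For context, the linked docs show that the Ollama chat API expects model parameters nested under an options map rather than at the top level of the request body. A minimal sketch of the expected payload shape (values are illustrative, not from the PR):

```python
import json

# Illustrative payload for Ollama's /api/chat endpoint: model parameters
# such as temperature and num_ctx belong in the nested "options" map,
# not at the top level of the request body.
payload = {
    "model": "llama3.1",
    "messages": [{"role": "user", "content": "Hello"}],
    "options": {
        "temperature": 0.2,
        "num_ctx": 4096,
    },
}
print(json.dumps(payload, indent=2))
```

Parameters placed at the top level instead of under options are what triggers the "unknown" complaints described above.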

@brainlid (Owner) commented Mar 6, 2025

Hi @alappe! I assume this is still a needed change, right? The doc links you provided sure make it appear that way. (Thanks for those, btw!)

@mustela, your recent note here leads me to believe you're successfully using the library with Ollama. Are you doing it with or without this change?

I'm trying to understand if making this change breaks backward compatibility or if this is just how it must be done going forward. Thanks!

@alappe (Contributor, Author) commented Mar 7, 2025

Hi @brainlid,

yes, this is still needed. The library is still usable with Ollama at the moment because, if you don't provide the options (in the right place), the defaults apply. But you cannot change any options without this fix. IMHO it breaks nothing, because the current state simply didn't work correctly with Ollama.
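The gist of the fix can be sketched as moving flat, top-level model parameters into the nested options map the API expects. A hypothetical illustration (the actual PR is Elixir; nest_options and OPTION_KEYS are names invented here, and the key set is an illustrative subset):

```python
# Hypothetical sketch of the fix: collect Ollama model parameters that
# were sent flat at the top level of the request and nest them under
# an "options" map, as the chat API expects.
OPTION_KEYS = {"temperature", "num_ctx", "num_predict", "top_p", "seed"}

def nest_options(request):
    """Move known model parameters into a nested 'options' map."""
    options = {k: v for k, v in request.items() if k in OPTION_KEYS}
    rest = {k: v for k, v in request.items() if k not in OPTION_KEYS}
    return {**rest, "options": options}

flat = {"model": "llama3.1", "messages": [], "temperature": 0.2, "num_ctx": 2048}
print(nest_options(flat))
```

With parameters sent flat, Ollama ignores them and falls back to defaults, which matches the behavior described above.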

@alappe force-pushed the ollama_fix_options branch from 6ad8e91 to 471951c on March 13, 2025
@alappe (Contributor, Author) commented Mar 13, 2025

@brainlid I rebased the PR…

@brainlid brainlid merged commit 4e3ffe7 into brainlid:main Apr 5, 2025
2 checks passed
@brainlid (Owner) commented Apr 5, 2025

Thanks @alappe! I'm trusting that this is working well as I'm not actively testing Ollama. I appreciate the help improving the library!
❤️💛💙💜

brainlid added a commit that referenced this pull request Apr 12, 2025
* main:
  Support for file with file_id in ChatOpenAI (#283)
  Fix options being passed to the ollama chat api (#179)
  Support for json_response in ChatModels.ChatGoogleAI (#277)
  support streaming responses from mistral (#287)
  feat: File urls for Google (#286)
  added ChatPerplexity to the docs - updated ContentPart docs
brainlid added a commit that referenced this pull request Apr 23, 2025
* main:
  updated readme for version install
  updated mix.exs version for release
  updated changelog for v0.3.3
  added content part description to OpenAI module doc for file uploads
  fixed doc typo
  adds telemetry (#284)
  check that the requested tool_name exists - return an error if it does not exist in the chain
  added LLMChain.run_until_tool_used/3 (#292)
  Support for file with file_id in ChatOpenAI (#283)
  Fix options being passed to the ollama chat api (#179)
  Support for json_response in ChatModels.ChatGoogleAI (#277)
  support streaming responses from mistral (#287)
  feat: File urls for Google (#286)