
vertexai: Add "mistral-large-2411@001" to model garden maas #663

Open · wants to merge 1 commit into base: main
Conversation

fjeanchar2

Description:

🆕 New Feature: Add Mistral's latest model "mistral-large-2411@001" to model garden maas.

Tests:

✅ Test

from langchain_google_vertexai.model_garden_maas.mistral import VertexModelGardenMistral
 
model = VertexModelGardenMistral(
    model_name="mistral-large-2411@001",
    max_output_tokens=8000,
    top_k=10,
    temperature=0.0,
    location="europe-west4",
    project="sandbox-fjeancharles",
)

model.invoke("hello")
output: AIMessage(content="Hello! How can I assist you today? Let's have a friendly conversation. 😊 How are you doing?", additional_kwargs={}, response_metadata={'token_usage': {'prompt_tokens': 5, 'total_tokens': 29, 'completion_tokens': 24}, 'model': 'mistral-large-2411@001', 'finish_reason': 'stop'}, id='run-3ab72497-c2eb-44e0-bd34-ddefeb315bf8-0', usage_metadata={'input_tokens': 5, 'output_tokens': 24, 'total_tokens': 29})

@lkuligin
Collaborator

Integration tests are failing, unfortunately. Are you sure the model has the same output format as the previous one?

@fjeanchar2
Author

> Integration tests are failing, unfortunately. Are you sure the model has the same output format as the previous one?

I haven't tested the streaming part, but I'll look into it and let you know!

@fjeanchar2
Author

fjeanchar2 commented Dec 27, 2024

> Integration tests are failing, unfortunately. Are you sure the model has the same output format as the previous one?

@lkuligin a fix for the error seen in the stream and astream methods (see the test details) was made in version 0.2.4 of the langchain-mistralai package, in chat_models.py (L600 and L626):
https://github.com/langchain-ai/langchain/blob/master/libs/partners/mistralai/langchain_mistralai/chat_models.py

I can see in the poetry.lock file that this repo is still pinned to langchain-mistralai 0.2.3.
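For context, a common cause of this kind of streaming failure is that streamed chat-completion deltas can carry `content: None` (for example, the final chunk that only reports a `finish_reason`), so the parser must normalize it before concatenating. The sketch below is illustrative only and is not the actual langchain-mistralai patch; the function names and the chunk shape are assumptions for demonstration.

```python
# Illustrative sketch (NOT the actual langchain-mistralai fix): defensively
# extracting text from streamed chat-completion chunks whose delta "content"
# may be None or absent.

def extract_delta_content(chunk: dict) -> str:
    """Return the text content of a streamed chunk, treating None as ''."""
    choices = chunk.get("choices") or [{}]
    delta = choices[0].get("delta") or {}
    content = delta.get("content")
    return content if isinstance(content, str) else ""

def join_stream(chunks: list[dict]) -> str:
    """Concatenate the text of a full stream of chunks."""
    return "".join(extract_delta_content(c) for c in chunks)

if __name__ == "__main__":
    # A hypothetical stream: the last chunk has content=None and only a
    # finish_reason, which would crash a naive "".join of raw contents.
    chunks = [
        {"choices": [{"delta": {"role": "assistant", "content": "Hel"}}]},
        {"choices": [{"delta": {"content": "lo"}}]},
        {"choices": [{"delta": {"content": None}, "finish_reason": "stop"}]},
    ]
    print(join_stream(chunks))  # prints "Hello"
```

Without the `isinstance` guard, the final `None` chunk raises a `TypeError` inside `str.join`, which matches the symptom of a stream/astream parsing bug.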
