
formatting
Bam4d committed Dec 9, 2023
1 parent 85d0f5e commit 12e50d1
Showing 2 changed files with 3 additions and 3 deletions.
2 changes: 1 addition & 1 deletion src/mistralai/async_client.py
@@ -274,7 +274,7 @@ async def chat_stream(
[{role: 'user', content: 'What is the best French cheese?'}]
temperature (Optional[float], optional): the temperature to use for sampling, e.g. 0.5.
max_tokens (Optional[int], optional): the maximum number of tokens to generate, e.g. 100. Defaults to None.
top_p (Optional[float], optional): the cumulative probability of tokens to generate, e.g. 0.9.
Defaults to None.
random_seed (Optional[int], optional): the random seed to use for sampling, e.g. 42. Defaults to None.
safe_mode (bool, optional): whether to use safe mode, e.g. true. Defaults to False.
4 changes: 2 additions & 2 deletions src/mistralai/client.py
@@ -116,7 +116,7 @@ def chat(
[{role: 'user', content: 'What is the best French cheese?'}]
temperature (Optional[float], optional): the temperature to use for sampling, e.g. 0.5.
max_tokens (Optional[int], optional): the maximum number of tokens to generate, e.g. 100. Defaults to None.
top_p (Optional[float], optional): the cumulative probability of tokens to generate, e.g. 0.9.
Defaults to None.
random_seed (Optional[int], optional): the random seed to use for sampling, e.g. 42. Defaults to None.
safe_mode (bool, optional): whether to use safe mode, e.g. true. Defaults to False.
@@ -159,7 +159,7 @@ def chat_stream(
[{role: 'user', content: 'What is the best French cheese?'}]
temperature (Optional[float], optional): the temperature to use for sampling, e.g. 0.5.
max_tokens (Optional[int], optional): the maximum number of tokens to generate, e.g. 100. Defaults to None.
top_p (Optional[float], optional): the cumulative probability of tokens to generate, e.g. 0.9.
Defaults to None.
random_seed (Optional[int], optional): the random seed to use for sampling, e.g. 42. Defaults to None.
safe_mode (bool, optional): whether to use safe mode, e.g. true. Defaults to False.
Expand Down
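For context, the parameters documented in these docstrings are keyword arguments to the client's `chat` and `chat_stream` methods. A minimal sketch of how they fit together, assuming the mistralai 0.x client of this era and an API key in the `MISTRAL_API_KEY` environment variable (the network call itself is shown commented out, since it requires credentials):

```python
# Sampling parameters as documented in the chat/chat_stream docstrings.
params = {
    "temperature": 0.5,   # sampling temperature, e.g. 0.5
    "max_tokens": 100,    # maximum number of tokens to generate; None means no cap
    "top_p": 0.9,         # cumulative probability for nucleus sampling, e.g. 0.9
    "random_seed": 42,    # seed for reproducible sampling, e.g. 42
    "safe_mode": False,   # whether to use safe mode; defaults to False
}

# Assumed usage with the 0.x client (requires the mistralai package and an API key):
# from mistralai.client import MistralClient
# from mistralai.models.chat_completion import ChatMessage
#
# client = MistralClient()  # reads MISTRAL_API_KEY from the environment
# response = client.chat(
#     model="mistral-tiny",
#     messages=[ChatMessage(role="user", content="What is the best French cheese?")],
#     **params,
# )
```

The async client's `chat_stream` accepts the same keyword arguments, returning an async iterator of chunks instead of a single response.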
