fix: decrease max_tokens to 1024
The previous value of 8192 is too large and causes the following error: "max_tokens is too large: 8192. This model supports at most 4096 completion tokens, whereas you provided 8192."

max_tokens limits the number of tokens in the *output* only, and input tokens + max_tokens must not exceed the model's context length. 1024 should be a safe value and is sufficiently large for generating even the most advanced CLI commands.

Resolves: #28
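The constraint described above can be sketched as a small helper. This is an illustrative example, not part of the commit; the function name and default limits (a 4096 completion cap and an 8192-token context window, as in the error message) are assumptions for demonstration.

```python
def clamp_max_tokens(prompt_tokens: int,
                     context_length: int = 8192,
                     completion_cap: int = 4096,
                     desired: int = 1024) -> int:
    """Pick a max_tokens value that satisfies both limits.

    max_tokens counts only completion (output) tokens, but the prompt
    and the completion must together fit in the context window:
        prompt_tokens + max_tokens <= context_length
    Some models additionally cap completion tokens on their own
    (e.g. "supports at most 4096 completion tokens").
    """
    budget = min(desired, completion_cap, context_length - prompt_tokens)
    return max(budget, 0)
```

With a short prompt the desired value of 1024 fits comfortably; with a very long prompt the context window becomes the binding limit, which is why a fixed 8192 could never be valid here.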
Realiserad committed Jun 19, 2024
1 parent 318d2d2 commit 807f464
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion src/fish_ai/engine.py
@@ -218,7 +218,7 @@ def get_response(messages):
     else:
         completions = get_openai_client().chat.completions.create(
             model=get_config('model') or 'gpt-4',
-            max_tokens=8192,
+            max_tokens=1024,
             messages=messages,
             stream=False,
             temperature=float(get_config('temperature') or '0.2'),
