Is there a way to make the CLI show the whole response instead of one word at a time? #157
-
Thank you for the project. The CLI works great, but the response appears slowly, one word at a time, just like on https://chat.openai.com/chat. Is there a way to make it show the whole response at once?
Answered by waylaidwanderer on Mar 4, 2023
-
The output is shown to you as the AI model generates each token, so no. You could disable streaming, but the response would still take the same amount of time to generate; you would just wait in silence and then see it all at once.
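For illustration only, here is a minimal sketch of the difference, written against the official `openai` Node SDK (v4 or later) rather than this project's own CLI code; the model name and prompt are placeholders. Streaming prints tokens as they arrive, while the buffered call waits for the full completion and prints once, and the total wall-clock time is roughly the same either way.

```typescript
// Minimal sketch, assuming the official `openai` Node SDK (v4+) and an
// OPENAI_API_KEY in the environment. Not the code used by this project's CLI.
import OpenAI from "openai";

const client = new OpenAI();
const messages = [
  { role: "user" as const, content: "Explain streaming in one paragraph." },
];

// Streamed: tokens are printed as the model generates them
// (this is what the CLI does now).
async function streamed(): Promise<void> {
  const stream = await client.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages,
    stream: true,
  });
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
  process.stdout.write("\n");
}

// Buffered: nothing is printed until the full response is ready.
// Generation still takes the same time; you just wait silently.
async function buffered(): Promise<void> {
  const completion = await client.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages,
  });
  console.log(completion.choices[0]?.message?.content ?? "");
}

async function main(): Promise<void> {
  await streamed();
  await buffered();
}

main().catch(console.error);
```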