
Fix prompt removal logic and update test case #439

Merged

Conversation

venondev (Contributor)

Solves #438

rlouf (Member) commented Dec 15, 2023

Thank you for opening a PR! Is it problematic when streaming as well?

venondev (Contributor, Author)

Works fine in streaming mode, probably because we do not have to remove the prompt.
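For context, a minimal sketch of the failure mode this PR addresses — the function names and token ids below are illustrative assumptions, not the actual outlines implementation. Stripping the prompt from the decoded text by character slicing assumes the decoded output begins with the raw prompt string, which breaks when the tokenizer prepends its own start token; slicing at the token level sidesteps that:

```python
# Hypothetical sketch (assumed names, not the outlines API): removing the
# prompt from a generated sequence at the token level rather than by
# character slicing.

def strip_prompt_by_chars(decoded: str, prompt: str) -> str:
    """Naive removal: assumes the decoded text starts with the raw prompt."""
    return decoded[len(prompt):]

def strip_prompt_by_tokens(output_ids: list, prompt_ids: list) -> list:
    """Robust removal: drop exactly the tokens that were fed as the prompt."""
    return output_ids[len(prompt_ids):]

# Toy example: pretend token id 1 is the start token "<s>".
prompt_ids = [1, 100, 101]        # encoder turned the prompt into these ids
output_ids = [1, 100, 101, 200]   # model echoed the prompt, then generated 200

print(strip_prompt_by_tokens(output_ids, prompt_ids))  # [200]

# The character-based variant goes wrong as soon as decoding inserts
# anything (e.g. a "<s> " prefix) before the raw prompt text:
decoded = "<s> Hello world"
print(strip_prompt_by_chars(decoded, "Hello"))  # leaves "<s> " residue
```

In streaming mode only freshly generated tokens are emitted, so no prompt removal is needed at all, which matches the observation above.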

rlouf (Member) commented Dec 16, 2023

I will need to rebase the merge commit, otherwise this looks good! Thank you for contributing.

@rlouf force-pushed the fix/prompt-removal-from-output-sequence branch from 314e57b to 40f4f5d on December 16, 2023 14:01
@rlouf force-pushed the fix/prompt-removal-from-output-sequence branch from 40f4f5d to 516709d on December 16, 2023 14:03
@rlouf merged commit 1bc9cae into dottxt-ai:main on Dec 16, 2023
5 checks passed
Development

Successfully merging this pull request may close these issues.

Using Mistral's tokenizer and a start token in the prompt results in a failed response
2 participants