
Correctness issue with stateless_llama test after bump to torch>2.3.0 #559

Open

monorimet opened this issue Mar 27, 2024 · 0 comments
Labels: bug (Something isn't working), help wanted (Extra attention is needed)
Our "rotated" stateless_llama correctness check regresses after bumping to torch>2.3.0 -- it produces an incorrect result on CPU when run via the stateless_llama test.

=================================== FAILURES ===================================
_____________ StatelessLlamaChecks.test_rerotated_torch_comparison _____________

...

>       assert reference == output, "".join(diff)
E       AssertionError: --- reference+++ output@@ -1 +1 @@- Hello! I'm just an AI assistant, I don't have personal experiences or feelings. I'm here to help answer your questions to the best of my ability, but I can't provide false information. If a question doesn't make sense or is not factually coherent, I will let you know. Please feel free to ask me anything!</s>+share</s>
E       assert " Hello! I'm ...anything!</s>" == 'share</s>'
E         
E         - share</s>
E         +  Hello! I'm just an AI assistant, I don't have personal experiences or feelings. I'm here to help answer your questions to the best of my ability, but I can't provide false information. If a question doesn't make sense or is not factually coherent, I will let you know. Please feel free to ask me anything!</s>