
Conversation

NicolasHug (Member) commented:

Using the result of `t_input.contiguous().data_ptr<T>();` is unsafe if `t_input` isn't contiguous in the first place: in that case `t_input.contiguous()` returns a new temporary tensor that is destroyed at the end of the statement, so the pointer dangles. Instead, we should keep the temporary tensors around.
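
A minimal sketch of the problem and the fix (the function and variable names are illustrative, not taken from the actual torchvision sources). Note that when `t_input` is already contiguous, `contiguous()` returns a tensor sharing the same storage, so the bug only bites when a copy is made:

```cpp
#include <torch/torch.h>

void process(const at::Tensor& t_input) {
  // Unsafe: if t_input is not contiguous, contiguous() materializes a new
  // temporary tensor that is destroyed at the end of this statement, leaving
  // the raw pointer dangling.
  // const float* bad_ptr = t_input.contiguous().data_ptr<float>();

  // Safe: bind the (possibly newly allocated) contiguous tensor to a local
  // variable so it stays alive for as long as the raw pointer is used.
  auto t_input_contig = t_input.contiguous();
  const float* ptr = t_input_contig.data_ptr<float>();

  // ... use ptr while t_input_contig is still in scope ...
  (void)ptr;
}
```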

Related internal post: https://fb.workplace.com/groups/1405155842844877/permalink/4735263356500759/

Similar past PR: #2131

CC @fmassa

@NicolasHug NicolasHug added bug module: ops module: models.quantization Issues related to the quantizable/quantized models labels May 24, 2021
@datumbox (Contributor) left a comment:

LGTM, merge after green tests.

@NicolasHug (Member, Author):

Thanks for the fast review!

@NicolasHug NicolasHug merged commit 9c31d1d into pytorch:master May 24, 2021
facebook-github-bot pushed a commit that referenced this pull request May 25, 2021
Reviewed By: vincentqb, cpuhrsch

Differential Revision: D28679963

fbshipit-source-id: dffae9aa3764472685d5b45be6e62949d8eb318d
