
Correctly set _GLIBCXX_USE_CXX11_ABI pybind compile options #6744

Merged
merged 5 commits into main from set-_GLIBCXX_USE_CXX11_ABI on Nov 9, 2024

Conversation

huydhn
Contributor

@huydhn huydhn commented Nov 9, 2024

I finally understand why pytorch/pytorch#139947 (comment) passed on PyTorch nightly binaries while failing on the ones built from source on CI. It turns out that _GLIBCXX_USE_CXX11_ABI is always left unset when building the ET pybind extension. For a reason I'm not sure of, PyTorch nightly binaries unset _GLIBCXX_USE_CXX11_ABI while PyTorch CI sets it; maybe @malfet has some context here. This explains why the test couldn't find the quantized symbols. The missing torchao and torchtune were a red herring.

As ET already has the EXECUTORCH_DO_NOT_USE_CXX11_ABI option, the fix here is simply to check it and set _GLIBCXX_USE_CXX11_ABI accordingly.
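A minimal sketch of what that check could look like in CMake, assuming the existing EXECUTORCH_DO_NOT_USE_CXX11_ABI option; the `portable_lib` target name is illustrative and may differ from the actual pybind target in the PR:

```cmake
# Sketch only: derive _GLIBCXX_USE_CXX11_ABI from the existing
# EXECUTORCH_DO_NOT_USE_CXX11_ABI option instead of leaving it unset,
# so the pybind extension matches the ABI of the PyTorch it links against.
if(EXECUTORCH_DO_NOT_USE_CXX11_ABI)
  target_compile_definitions(portable_lib PRIVATE _GLIBCXX_USE_CXX11_ABI=0)
else()
  target_compile_definitions(portable_lib PRIVATE _GLIBCXX_USE_CXX11_ABI=1)
endif()
```

Setting the macro explicitly in both branches avoids relying on libstdc++'s default, which is what caused the nightly-vs-CI divergence described above.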


pytorch-bot bot commented Nov 9, 2024

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/6744

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 51d6fba with merge base 785ebf3:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Nov 9, 2024
@malfet
Contributor

malfet commented Nov 9, 2024

As to the reason why this is necessary: until 2.6, PyTorch was targeting manylinux2014, which still ships with a libstdc++ that does not support the C++11 ABI, so wheel binaries were always built with the pre-cxx11 ABI (but conda builds with it, as conda ships a newer libstdc++)
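The symbol-lookup failure mentioned above is visible directly in mangled names: libstdc++'s dual ABI puts the new `std::string` under the `std::__cxx11` inline namespace, so symbols compiled on either side of the ABI split don't match. A small illustrative check (the example symbols are hand-picked, not taken from the ET libraries in question):

```python
# Sketch: spotting which libstdc++ ABI a mangled symbol was compiled against.
# With _GLIBCXX_USE_CXX11_ABI=1, std::string lives in the std::__cxx11
# inline namespace, which shows up literally as "__cxx11" in mangled names.
def uses_cxx11_abi(mangled: str) -> bool:
    """Heuristic: True if the symbol references the std::__cxx11 namespace."""
    return "__cxx11" in mangled

# std::__cxx11::basic_string<char, ...>::basic_string() -- new ABI
new_abi = "_ZNSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEC1Ev"
# std::string::assign(char const*, unsigned long) -- old (pre-cxx11) ABI
old_abi = "_ZNSs6assignEPKcm"

print(uses_cxx11_abi(new_abi))  # True
print(uses_cxx11_abi(old_abi))  # False
```

In practice you would feed this the output of `nm -D` on the extension and on libtorch; if one side has `__cxx11` symbols and the other doesn't, the two were built with mismatched `_GLIBCXX_USE_CXX11_ABI` settings.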

huydhn and others added 2 commits November 8, 2024 19:15
Co-authored-by: Nikita Shulga <2453524+malfet@users.noreply.github.com>
@huydhn
Contributor Author

huydhn commented Nov 9, 2024

> As to the reason why this is necessary: until 2.6, PyTorch was targeting manylinux2014, which still ships with a libstdc++ that does not support the C++11 ABI, so wheel binaries were always built with the pre-cxx11 ABI (but conda builds with it, as conda ships a newer libstdc++)

Thank you for the explanation! Now that makes sense, I vaguely remember reading about this before.

@huydhn
Contributor Author

huydhn commented Nov 9, 2024

Going to test this out on pytorch/pytorch#140199 and wait for a green CI signal before landing. I have been able to verify this locally.
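One way to verify locally, as a hedged sketch: query the installed PyTorch for the ABI it was built with, then build the extension with a matching `_GLIBCXX_USE_CXX11_ABI`. The helper below returns `None` when torch isn't installed so it degrades gracefully; `torch.compiled_with_cxx11_abi()` is PyTorch's public helper for this check.

```python
# Sketch: discover which libstdc++ ABI the installed PyTorch expects, so a
# pybind extension can set _GLIBCXX_USE_CXX11_ABI to match it at build time.
import importlib.util


def torch_cxx11_abi():
    """Return True/False for torch's C++11 ABI, or None if torch is absent."""
    if importlib.util.find_spec("torch") is None:
        return None
    import torch
    return bool(torch.compiled_with_cxx11_abi())


print(torch_cxx11_abi())
```

A build script could map `True` to `-D_GLIBCXX_USE_CXX11_ABI=1` and `False` to `=0`, which is the same decision the CMake change in this PR encodes via EXECUTORCH_DO_NOT_USE_CXX11_ABI.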

@huydhn
Contributor Author

huydhn commented Nov 9, 2024

@huydhn huydhn merged commit 289e84e into main Nov 9, 2024
39 checks passed
@huydhn huydhn deleted the set-_GLIBCXX_USE_CXX11_ABI branch November 9, 2024 06:28
pytorchmergebot pushed a commit to pytorch/pytorch that referenced this pull request Nov 11, 2024
This will be updated to ET trunk commit after pytorch/executorch#6744 lands.  I also move ET back from unstable and install llama3 dependencies
Pull Request resolved: #140199
Approved by: https://github.com/kit1980
zhangxiaoli73 pushed a commit to zhangxiaoli73/pytorch that referenced this pull request Nov 13, 2024
This will be updated to ET trunk commit after pytorch/executorch#6744 lands.  I also move ET back from unstable and install llama3 dependencies
Pull Request resolved: pytorch#140199
Approved by: https://github.com/kit1980