Fix setup.py neuron-ls issue #2671
Conversation
LGTM! Thanks for the work!
if _is_cuda():
    with open(get_path("requirements.txt")) as f:
        requirements = f.read().strip().split("\n")
elif _is_hip():
    with open(get_path("requirements-rocm.txt")) as f:
        requirements = f.read().strip().split("\n")
elif _is_neuron():
    with open(get_path("requirements-neuron.txt")) as f:
        requirements = f.read().strip().split("\n")
Add an `else` statement here and raise an error in it? A sketch of what that could look like is below.
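A minimal sketch of that suggestion, assuming the existing helpers shown above; the exact exception type and message are illustrative, not the PR's actual code:

```python
if _is_cuda():
    with open(get_path("requirements.txt")) as f:
        requirements = f.read().strip().split("\n")
elif _is_hip():
    with open(get_path("requirements-rocm.txt")) as f:
        requirements = f.read().strip().split("\n")
elif _is_neuron():
    with open(get_path("requirements-neuron.txt")) as f:
        requirements = f.read().strip().split("\n")
else:
    # Fail loudly instead of silently leaving `requirements` undefined.
    raise RuntimeError(
        "Unknown runtime environment: expected CUDA, ROCm, or Neuron.")
```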
setup.py (Outdated)
def _is_cuda() -> bool:
    return (torch.version.cuda is not None) and not _is_neuron()

    return shutil.which("neuron-ls") is not None
Just to let you know, I tried implementing just this line and it doesn't work on an AWS AMI: the neuron-ls package is installed, but if you're not on a Neuron device it still fails. What worked for me was to change the errors caught on line 39 to also include `subprocess.CalledProcessError`.
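For reference, a minimal sketch of a Neuron check along those lines, assuming `_is_neuron()` probes the `neuron-ls` CLI via `subprocess` (the exact implementation in setup.py may differ):

```python
import subprocess


def _is_neuron() -> bool:
    # Probe the AWS Neuron CLI. On an AMI where neuron-ls is installed but no
    # Neuron device is attached, the binary exists but exits non-zero, so
    # CalledProcessError must be caught alongside FileNotFoundError.
    try:
        subprocess.run(["neuron-ls"], capture_output=True, check=True)
    except (FileNotFoundError, PermissionError,
            subprocess.CalledProcessError):
        return False
    return True
```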
Thanks! Updated my PR to reflect that.
This should fix #2661