[python] Update lmi-dist #975
Conversation
mpi4py sentencepiece einops accelerate==${accelerate_version} bitsandbytes==${bitsandbytes_version} \
    diffusers[torch]==${diffusers_version} peft==${peft_version} opencv-contrib-python-headless safetensors scipy && \
    scripts/install_flash_attn.sh && \
    scripts/install_flash_attn_v2.sh && \
Why do we need both?
Some machines don't support flash attention v2, so we have to install both. Which version to use is then determined at runtime.
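The runtime selection described above can be sketched as a pure function over the GPU's CUDA compute capability. This is an illustrative sketch, not the actual lmi-dist logic: the function name and the capability thresholds (v2 assumed to need Ampere/SM80+, v1 assumed to need Turing/SM75+) are assumptions for the example; in practice the capability tuple would come from `torch.cuda.get_device_capability()`.

```python
def flash_attn_version(compute_capability):
    """Pick a flash-attention major version for a GPU.

    `compute_capability` is a (major, minor) tuple, e.g. (8, 0) for A100.
    Thresholds are illustrative assumptions, not lmi-dist's exact rules:
    v2 is assumed to require SM80+ (Ampere), v1 SM75+ (Turing).
    Returns 0 when neither is supported, signalling a fallback to a
    non-flash attention implementation.
    """
    if compute_capability >= (8, 0):
        return 2
    if compute_capability >= (7, 5):
        return 1
    return 0
```

At import time the server would call this once and load the matching wheel, which is why both `install_flash_attn.sh` and `install_flash_attn_v2.sh` run in the image build.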
@@ -16,6 +16,7 @@ ARG python_version=3.9
ARG torch_version=2.0.1
ARG torch_vision_version=0.15.2
ARG deepspeed_wheel="https://publish.djl.ai/deepspeed/deepspeed-nightly-py2.py3-none-any.whl"
ARG vllm_wheel="https://publish.djl.ai/vllm/vllm-0.0.0-cp39-cp39-linux_x86_64.whl"
I would check with @frankfliu; vllm introduced a lot of dependencies you don't want, which is why we didn't include it for V5. If you already understand this, please feel free to ignore.
Yes, I discussed this with Frank yesterday. Installing the wheel this way will not pull in its dependencies.
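For context, pip's `--no-deps` flag is the standard way to install a wheel without its dependency tree; it is presumably what the Dockerfile's install step relies on. A minimal sketch, using the wheel URL from the diff above:

```shell
# Install only the vllm wheel itself; --no-deps skips its pinned
# dependency tree, keeping the image's existing torch et al. intact.
pip install --no-deps \
    "https://publish.djl.ai/vllm/vllm-0.0.0-cp39-cp39-linux_x86_64.whl"
```

The trade-off is that any dependency vllm genuinely needs at import time must already be present in the image.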
Total size increase is 0.5GB.
Force-pushed from c25f61f to ee213c1
* [python] Upgrade the dependency for lmi-dist
* [python] Upgrade lmi-dist
Description
Brief description of what this PR is about