
[python] Update lmi-dist #975

Merged

xyang16 merged 2 commits from the update branch into deepjavalibrary:master on Jul 27, 2023
Conversation

xyang16 (Contributor) commented on Jul 26, 2023

Description

Brief description of what this PR is about

  • If this change is a backward incompatible change, why must this change be made?
  • Interesting edge cases to note here

mpi4py sentencepiece einops accelerate==${accelerate_version} bitsandbytes==${bitsandbytes_version}\
diffusers[torch]==${diffusers_version} peft==${peft_version} opencv-contrib-python-headless safetensors scipy && \
scripts/install_flash_attn.sh && \
scripts/install_flash_attn_v2.sh && \
Contributor:
Why do we need both install scripts?

xyang16 (Author):
Some machines don't support flash attention v2, so we have to install both; the runtime then determines which version to use.
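
As a minimal sketch of that runtime choice (an assumption, not lmi-dist's actual logic): flash attention v2 requires an NVIDIA GPU with compute capability 8.0 (Ampere) or newer, so a capability probe could select the version. FLASH_ATTN_VERSION is a hypothetical name, not a variable the library is known to read.

# Hedged sketch: prefer flash attention v2 on compute capability >= 8.0 (Ampere+),
# otherwise fall back to v1. Assumes nvidia-smi supports the compute_cap query field;
# FLASH_ATTN_VERSION is hypothetical, not something lmi-dist actually reads.
cap=$(nvidia-smi --query-gpu=compute_cap --format=csv,noheader | head -n1)
if [ "${cap%%.*}" -ge 8 ]; then
    export FLASH_ATTN_VERSION=2
else
    export FLASH_ATTN_VERSION=1
fi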

@@ -16,6 +16,7 @@ ARG python_version=3.9
ARG torch_version=2.0.1
ARG torch_vision_version=0.15.2
ARG deepspeed_wheel="https://publish.djl.ai/deepspeed/deepspeed-nightly-py2.py3-none-any.whl"
ARG vllm_wheel="https://publish.djl.ai/vllm/vllm-0.0.0-cp39-cp39-linux_x86_64.whl"
Contributor:

I would check with @frankfliu. vllm introduced a lot of dependencies you don't want, which is why we didn't include it for V5. If you already understand this, please feel free to ignore.

xyang16 (Author):
Yes, I discussed this with Frank yesterday. Installing the prebuilt wheel this way will not pull in those dependencies.
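
A minimal sketch of such an install, assuming pip's --no-deps flag is the mechanism (the exact install command is not shown in this diff):

# Hedged sketch: install the prebuilt vllm wheel while skipping its declared
# dependencies. Whether the Dockerfile actually passes --no-deps is an assumption.
pip install --no-deps "${vllm_wheel}"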

xyang16 (Author):
The total image size increase is 0.5 GB.

@xyang16 marked this pull request as ready for review on Jul 26, 2023 17:03
@xyang16 requested review from zachgk and a team as code owners on Jul 26, 2023 17:03
@xyang16 requested a review from maaquib on Jul 26, 2023 17:03
@xyang16 changed the title from "[python] Upgrade the dependency for lmi-dist" to "[python] Update lmi-dist" on Jul 26, 2023
@xyang16 force-pushed the update branch 2 times, most recently from c25f61f to ee213c1, on Jul 26, 2023 20:05
@xyang16 merged commit 52f3eff into deepjavalibrary:master on Jul 27, 2023
KexinFeng pushed a commit to KexinFeng/djl-serving-forked that referenced this pull request on Aug 16, 2023:

* [python] Upgrade the dependency for lmi-dist
* [python] Upgrade lmi-dist