This repository has been archived by the owner on Oct 11, 2024. It is now read-only.

Commit

rename for NIGHTLY
andy-neuma committed May 16, 2024
1 parent 1c97968 commit 0aa702d
Showing 2 changed files with 31 additions and 0 deletions.
17 changes: 17 additions & 0 deletions .github/actions/nm-rename-for-nightly-whl/action.yml
@@ -0,0 +1,17 @@
name: rename for nightly
description: 'renames nm-vllm as nightly'
inputs:
package_name:
description: "'package base name, e.g. 'nm-vllm'"
required: true
runs:
using: composite
steps:
- run: |
PACKAGE_NAME=${{ inputs.package_name }}
sed -i "s/name=\"${PACKAGE_NAME}\"/name=\"${PACKAGE_NAME}-nightly\"/g" setup.py
VERSION=$(grep "__version__" vllm/__init__.py | sed -e "s/__version__ = //g" -e "s/,//g" | sed -e "s/\"//g")
echo ${VERSION}
DATE=$(date +%Y%m%d)
sed -i "s/__version__ = \"${VERSION}\"/__version__ = \"${VERSION}.${DATE}\"/g" vllm/__init__.py
shell: bash
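
For reference, here is a minimal, self-contained sketch of what this composite action's run step does, exercised against throwaway copies of the two files it edits. The setup.py and vllm/__init__.py contents below are assumptions for illustration only; the real files are not part of this diff.

#!/usr/bin/env bash
# Hedged sketch: reproduce the rename-for-nightly logic in a scratch directory.
set -euo pipefail
mkdir -p /tmp/nightly-demo/vllm && cd /tmp/nightly-demo
# Assumed file contents (illustrative, not taken from this repository):
printf 'setup(\n    name="nm-vllm",\n)\n' > setup.py
printf '__version__ = "0.3.0"\n' > vllm/__init__.py

PACKAGE_NAME="nm-vllm"
# Append "-nightly" to the package name declared in setup.py.
sed -i "s/name=\"${PACKAGE_NAME}\"/name=\"${PACKAGE_NAME}-nightly\"/g" setup.py
# Extract the current version string, then append today's date to it.
VERSION=$(grep "__version__" vllm/__init__.py | sed -e "s/__version__ = //g" -e "s/,//g" -e "s/\"//g")
DATE=$(date +%Y%m%d)
sed -i "s/__version__ = \"${VERSION}\"/__version__ = \"${VERSION}.${DATE}\"/g" vllm/__init__.py

cat setup.py vllm/__init__.py
# Expected result for a run on 2024-05-16:
#   setup.py:          name="nm-vllm"        -> name="nm-vllm-nightly"
#   vllm/__init__.py:  __version__ = "0.3.0" -> __version__ = "0.3.0.20240516"
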
14 changes: 14 additions & 0 deletions .github/workflows/build.yml
@@ -3,6 +3,10 @@ on:
# makes workflow reusable
workflow_call:
inputs:
wf_category:
description: "categories: REMOTE, NIGHTLY, RELEASE"
type: string
default: "REMOTE"
build_label:
description: "requested runner label (specifies instance)"
type: string
@@ -31,6 +35,10 @@ on:
# makes workflow manually callable
workflow_dispatch:
inputs:
wf_category:
description: "categories: REMOTE, NIGHTLY, RELEASE"
type: string
default: "REMOTE"
build_label:
description: "requested runner label (specifies instance)"
type: string
@@ -103,6 +111,12 @@ jobs:
testmo_token: ${{ secrets.TESTMO_TEST_TOKEN }}
source: 'build-test'

- name: rename for nightly
uses: ./.github/actions/nm-rename-for-nightly-whl/
if: contains(fromJSON('["NIGHTLY"]'), inputs.wf_category)
with:
package_name: "nm-vllm"

- name: build
id: build
uses: ./.github/actions/nm-build-vllm/
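
With these inputs in place, a nightly build is requested by dispatching the workflow with wf_category set to NIGHTLY (the default remains REMOTE), and the rename step above runs only when the NIGHTLY guard matches. A hedged usage sketch via the GitHub CLI follows; the workflow file name build.yml and the wf_category and build_label inputs come from this diff, while the placeholder runner label and any other required inputs not shown here are assumptions.

# Dispatch a NIGHTLY build; replace the placeholder and add any further
# required inputs that are truncated from this diff.
gh workflow run build.yml -f wf_category=NIGHTLY -f build_label="<runner-label>"
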

1 comment on commit 0aa702d

@github-actions


bigger_is_better

Benchmark suite (Current: 0aa702d, Previous: 6334dd3)

request_throughput
  Description: VLLM Engine throughput - synthetic; model - NousResearch/Llama-2-7b-chat-hf; max_model_len - 4096; benchmark_throughput {"use-all-available-gpus_": "", "input-len": 256, "output-len": 128, "num-prompts": 1000}
  Environment: NVIDIA A10G x 1; vllm_version 0.3.0; python_version 3.10.12 (main, May 10 2024, 13:42:25) [GCC 9.4.0]; torch_version 2.3.0+cu121
  Current: 3.8390743043201665 prompts/s; Previous: 3.8418198063652103 prompts/s; Ratio: 1.00

token_throughput
  Description and environment: identical to request_throughput above
  Current: 1474.204532858944 tokens/s; Previous: 1475.2588056442407 tokens/s; Ratio: 1.00

This comment was automatically generated by workflow using github-action-benchmark.
