Actions: iotamudelta/vllm

yapf

8 workflow runs
[Model] [BUG] Fix code path logic to load mllama model (#234)
yapf #8: Commit 1658370 pushed by iotamudelta
October 21, 2024 18:07 · 2m 41s · main

llama3.2 + cross attn test (#220)
yapf #7: Commit 2550f14 pushed by iotamudelta
October 8, 2024 14:48 · 2m 24s · main

Fixing P3L incompatibility with cython. (#200)
yapf #6: Commit bae9170 pushed by iotamudelta
September 23, 2024 16:37 · 2m 10s · main

Nccl env for performance (#152)
yapf #5: Commit 68db66a pushed by iotamudelta
August 27, 2024 14:33 · 1m 28s · main

Update test-template.j2 (#145)
yapf #4: Commit 7c5fd50 pushed by iotamudelta
August 20, 2024 16:38 · 1m 25s · main

Make CAR ROCm 6.1 compatible. (#137)
yapf #3: Commit 4d2dda6 pushed by iotamudelta
August 15, 2024 15:21 · 1m 51s · main
yapf #2: August 14, 2024 22:36 · 1m 19s
save shape when fp8 solution not found (#123)
yapf #1: Commit 8608888 pushed by iotamudelta
August 9, 2024 02:46 · 1m 18s · main