Actions: vllm-project/vllm

mypy

15,840 workflow runs

[Core] Support Lora lineage and base model metadata management (#6315)
mypy #15885: Commit 260d40b pushed by simon-mo
September 20, 2024 06:20 45s main
[Core]: Support encode only models by Workflow Defined Engine
mypy #15884: Pull request #8452 synchronize by noooop
September 20, 2024 06:03 49s noooop:wde_encode_only
[Core]: Support encode only models by Workflow Defined Engine
mypy #15883: Pull request #8452 synchronize by noooop
September 20, 2024 06:02 1m 0s noooop:wde_encode_only
[Bugfix] fix docker build for xpu
mypy #15880: Pull request #8652 opened by yma11
September 20, 2024 05:26 44s yma11:docker-fix
[Frontend] Batch inference for llm.chat() API
mypy #15879: Pull request #8648 synchronize by aandyw
September 20, 2024 04:56 43s aandyw:batch-inf-llm-chat
[Frontend] Batch inference for llm.chat() API
mypy #15875: Pull request #8648 synchronize by aandyw
September 20, 2024 03:57 42s aandyw:batch-inf-llm-chat
[bugfix] [AMD] add multi-step advance_step to ROCmFlashAttentionMetad…
mypy #15874: Commit 9e5ec35 pushed by simon-mo
September 20, 2024 03:49 40s main
[Frontend] Batch inference for llm.chat() API
mypy #15872: Pull request #8648 synchronize by aandyw
September 20, 2024 03:35 39s aandyw:batch-inf-llm-chat
[Frontend] Batch inference for llm.chat() API
mypy #15871: Pull request #8648 synchronize by aandyw
September 20, 2024 03:32 40s aandyw:batch-inf-llm-chat
[Frontend] Batch inference for llm.chat() API
mypy #15869: Pull request #8648 synchronize by aandyw
September 20, 2024 03:20 27s aandyw:batch-inf-llm-chat
Adding Cascade Infer to FlashInfer
mypy #15868: Pull request #8132 synchronize by raywanb
September 20, 2024 03:10 33s raywanb:ray/cascade_infer
[Frontend] Batch inference for llm.chat() API
mypy #15865: Pull request #8648 synchronize by aandyw
September 20, 2024 02:35 28s aandyw:batch-inf-llm-chat
[MISC] add support custom_op check
mypy #15864: Pull request #8557 synchronize by jikunshang
September 20, 2024 02:22 44s jikunshang:check_custom_op
[Frontend] Batch inference for llm.chat() API
mypy #15863: Pull request #8648 synchronize by aandyw
September 20, 2024 01:55 39s aandyw:batch-inf-llm-chat
[Frontend] Batch inference for llm.chat() API
mypy #15862: Pull request #8648 opened by aandyw
September 20, 2024 01:54 44s aandyw:batch-inf-llm-chat