Update vllm backend to support offline and online serving modes #320

Triggered via pull request, July 22, 2024 11:38
Status: Success
Total duration: 10m 16s
Artifacts: none

Job: cli_cuda_tensorrt_llm_tests (8m 17s)

Annotations

1 warning
cli_cuda_tensorrt_llm_tests
The following action uses a Node.js version that is deprecated and will be forced to run on node20: actions/checkout@v3. For more info: https://github.blog/changelog/2024-03-07-github-actions-all-actions-will-run-on-node20-instead-of-node16-by-default/
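
This warning is resolved by upgrading the checkout step to a release that runs on Node 20. A minimal sketch of the workflow change, assuming the job's step names and structure (the job shown here is `cli_cuda_tensorrt_llm_tests`; the surrounding keys are illustrative):

```yaml
# .github/workflows/test_cli_cuda_tensorrt_llm.yaml (illustrative path)
jobs:
  cli_cuda_tensorrt_llm_tests:
    runs-on: ubuntu-latest
    steps:
      # actions/checkout@v4 runs on Node 20, silencing the node16 deprecation warning
      - name: Checkout
        uses: actions/checkout@v4
```

Pinning to `@v4` (or a specific `@v4.x.y` tag) is enough; no other step changes are required for this particular warning.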