Inference: fix batch_size issue. #1877
Triggered via pull request on July 19, 2023 03:18
Status: Success
Total duration: 2h 9m 59s
Artifacts: 1
This run and associated checks have been archived and are scheduled for deletion.

gpu-ci.yml
on: pull_request

Job Duration
GPU CI Concierge 15s
Inference Tests 50m 0s
Check Python Interface 17m 39s
Single Machine, Multiple GPUs Tests 1h 1m

Artifacts

Produced during runtime
Name Size
output (Expired) 2.5 KB