Inference: fix batch_size issue. #1871

Triggered via: pull request, July 18, 2023 04:41
Status: Success
Total duration: 2h 18m 59s
Artifacts: 1

This run and its associated checks have been archived and are scheduled for deletion.

Workflow: gpu-ci.yml (on: pull_request)
Jobs:
- GPU CI Concierge: 13s
- Check Python Interface: 17m 55s
- Single Machine, Multiple GPUs Tests: 1h 1m
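For context, a minimal sketch of what a gpu-ci.yml with this trigger and these three jobs might look like. Only the job names and the `pull_request` trigger come from the run summary; the runner labels, step commands, job dependencies, and artifact path are illustrative assumptions, not the repository's actual configuration.

```yaml
# Hypothetical reconstruction of gpu-ci.yml for illustration only.
name: gpu-ci

on: pull_request

jobs:
  concierge:
    name: GPU CI Concierge
    runs-on: ubuntu-latest            # assumed runner
    steps:
      - uses: actions/checkout@v3
      # Short gatekeeping step (e.g. label or permission check) would go here.

  python-interface:
    name: Check Python Interface
    needs: concierge                  # assumed ordering after the concierge job
    runs-on: [self-hosted, gpu]       # assumed self-hosted GPU runner label
    steps:
      - uses: actions/checkout@v3
      - run: pytest tests/python      # assumed test command

  multi-gpu:
    name: Single Machine, Multiple GPUs Tests
    needs: concierge                  # assumed ordering after the concierge job
    runs-on: [self-hosted, multi-gpu] # assumed runner label
    steps:
      - uses: actions/checkout@v3
      - run: pytest tests/multi_gpu   # assumed test command
      - uses: actions/upload-artifact@v3
        with:
          name: output                # the run did produce an "output" artifact
          path: output/               # assumed path
```

The `needs:` edges reflect a common pattern where a fast gatekeeper job runs before the expensive GPU jobs; the actual workflow may order its jobs differently.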

Artifacts (produced during runtime):
- Name: output (expired)
- Size: 2.49 KB