
Inference: fix batch_size issue. #1880
Re-run triggered: July 20, 2023 03:00
Status: Success
Total duration: 2h 31m 55s
Artifacts: 1
This run and its associated checks have been archived and are scheduled for deletion.

gpu-ci.yml

on: pull_request
Job Duration
GPU CI Concierge 14s
Inference Tests 50m 1s
Check Python Interface 17m 23s
Single Machine, Multiple GPUs Tests 1h 1m
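The jobs above come from a workflow file named gpu-ci.yml triggered on pull_request. A minimal sketch of what such a workflow could look like is below; only the job names and the trigger are taken from this run, while the runner labels, job ordering, and commands are assumptions:

```yaml
name: gpu-ci

on: pull_request

jobs:
  concierge:
    name: GPU CI Concierge
    runs-on: ubuntu-latest          # runner label is an assumption
    steps:
      - uses: actions/checkout@v3
      # short gatekeeping step (~14s in this run); actual command unknown

  inference-tests:
    name: Inference Tests
    needs: concierge                # ordering after the concierge is an assumption
    runs-on: self-hosted            # GPU runner label is an assumption
    steps:
      - uses: actions/checkout@v3
      - run: ./tests/run_inference_tests.sh   # hypothetical test script
```

The remaining two jobs (Check Python Interface; Single Machine, Multiple GPUs Tests) would follow the same shape as inference-tests.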

Artifacts

Produced during runtime
Name Size
output (expired) 2.49 KB