
Inference: fix batch_size issue. #1870

Triggered via pull request, July 18, 2023 04:33
Status: Cancelled
Total duration: 8m 10s
This run and its associated checks have been archived and are scheduled for deletion.

gpu-ci.yml

on: pull_request

Jobs:
GPU CI Concierge (12s)
Inference Tests (0s)
Check Python Interface (0s)
Single Machine, Multiple GPUs Tests (0s)

Annotations

2 errors

Check Python Interface: Canceling since a higher priority waiting request for 'gpu-ci-fix_batch_size' exists
Inference Tests: Canceling since a higher priority waiting request for 'gpu-ci-fix_batch_size' exists
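Both errors are GitHub Actions concurrency cancellations: a newer run queued for the same concurrency group ('gpu-ci-fix_batch_size') superseded this one. A minimal sketch of the kind of `concurrency` block in gpu-ci.yml that produces these messages; the group expression and job names here are assumptions inferred from the group name in the messages, not the actual workflow contents:

```yaml
# Hypothetical excerpt from gpu-ci.yml; the real file may differ.
name: gpu-ci

on: pull_request

# Runs sharing a concurrency group supersede each other: when a new run
# for the same branch is queued, the older pending/in-progress run is
# canceled with "Canceling since a higher priority waiting request ... exists".
# "gpu-ci-" plus the PR branch name yields groups like 'gpu-ci-fix_batch_size'.
concurrency:
  group: gpu-ci-${{ github.head_ref }}
  cancel-in-progress: true

jobs:
  gpu-ci-concierge:
    runs-on: ubuntu-latest
    steps:
      - run: echo "gatekeeper job the GPU test jobs depend on"
```

With this configuration, pushing a new commit to the `fix_batch_size` branch while this run was still queued would cancel it exactly as shown in the annotations.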