Update LLM Perf Benchmarks - CUDA PyTorch #124

Annotations

1 error and 2 warnings

run_benchmarks (gptq, 1xA10, aws-g5-4xlarge-plus): failed Sep 25, 2024 in 20s