
Use threads_per_warp=16 for 06-fused-attention.py #1597

Pre-commit checks: succeeded May 17, 2024 in 1m 26s
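
For context, a minimal sketch of what passing `threads_per_warp=16` at a Triton kernel launch site typically looks like. This is not the actual diff from this PR; the stub kernel, tensor shapes, and other launch arguments below are illustrative placeholders standing in for the fused-attention forward kernel in 06-fused-attention.py.

```python
# Illustrative sketch only: shows threads_per_warp=16 as a Triton launch
# option, not the real fused-attention kernel from the tutorial.
import torch
import triton
import triton.language as tl


@triton.jit
def _attn_fwd_stub(X, Y, N, BLOCK: tl.constexpr):
    # Placeholder body standing in for the fused-attention forward kernel.
    pid = tl.program_id(0)
    offs = pid * BLOCK + tl.arange(0, BLOCK)
    mask = offs < N
    x = tl.load(X + offs, mask=mask)
    tl.store(Y + offs, x, mask=mask)


def launch(x: torch.Tensor) -> torch.Tensor:
    y = torch.empty_like(x)
    n = x.numel()
    grid = (triton.cdiv(n, 1024),)
    # threads_per_warp is passed as a launch option alongside num_warps;
    # 16 targets hardware whose native warp (sub-group) size is 16 rather
    # than the default of 32.
    _attn_fwd_stub[grid](x, y, n, BLOCK=1024, num_warps=4, threads_per_warp=16)
    return y
```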