Use threads_per_warp=16 for 06-fused-attention.py #1596

Pre-commit checks: succeeded on May 17, 2024 in 1m 19s
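For context, the change described by the PR title presumably amounts to passing threads_per_warp=16 as a kernel launch option in the fused-attention tutorial. Below is a minimal sketch of that launch pattern, assuming a Triton backend that accepts threads_per_warp at launch (for example, the Intel XPU backend, whose hardware sub-group size is 16); the kernel, tensor names, and device string are illustrative and are not the actual 06-fused-attention.py code.

```python
import torch
import triton
import triton.language as tl


@triton.jit
def copy_kernel(x_ptr, y_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # Simple element-wise copy used only to illustrate the launch options.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements
    x = tl.load(x_ptr + offsets, mask=mask)
    tl.store(y_ptr + offsets, x, mask=mask)


# Assumption: "xpu" device and threads_per_warp launch support come from the
# Intel XPU backend; on other backends the option may be unsupported or ignored.
x = torch.randn(4096, device="xpu")
y = torch.empty_like(x)

grid = lambda meta: (triton.cdiv(x.numel(), meta["BLOCK_SIZE"]),)

# threads_per_warp=16 requests a 16-lane warp/sub-group instead of the
# CUDA-style default of 32, matching the change the PR title describes.
copy_kernel[grid](x, y, x.numel(), BLOCK_SIZE=256, threads_per_warp=16)
```

In the tutorial itself, the same keyword would be added to the attention kernel's launch (or its autotune configs), so the generated code targets 16-wide warps on hardware where that is the native sub-group size.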