Add comment with regard to test_backward for bf16
qianfengz committed Dec 21, 2024
1 parent f80aa5a commit 8d3dad1
Showing 1 changed file with 2 additions and 0 deletions.
tests/test_mem_eff_attention.py (2 additions, 0 deletions)
@@ -672,6 +672,8 @@ def test_backward(
     if op_bw == fmha.ck.BwOp:
         op_fw = fmha.ck.FwOp
         if dtype == torch.bfloat16:
+            ## bfloat16 testing can be enabled by exporting ENABLE_HIP_FMHA_RTN_BF16_CONVERT=1
+            ## when building xformers, which then gives accurate results
             pytest.skip(
                 "CK Fmha backward for bfloat16 currently is not very accurate for some cases!"
             )
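
For illustration, here is a minimal sketch of how such a skip could be made conditional at test time rather than unconditional, assuming the ENABLE_HIP_FMHA_RTN_BF16_CONVERT variable set when building xformers is also exported in the environment running the tests; the helper name maybe_skip_bf16_backward is hypothetical and not part of this commit:

    import os

    import pytest
    import torch


    def maybe_skip_bf16_backward(dtype: torch.dtype) -> None:
        # Hypothetical helper (not part of this commit): skip bf16 backward
        # tests only when the build was NOT done with RTN bf16 conversion
        # enabled, assuming the same variable used at build time is also
        # exported in the test environment.
        rtn_bf16_convert = os.environ.get("ENABLE_HIP_FMHA_RTN_BF16_CONVERT") == "1"
        if dtype == torch.bfloat16 and not rtn_bf16_convert:
            pytest.skip(
                "CK Fmha backward for bfloat16 currently is not very accurate for some cases!"
            )

With a guard like this, builds made with the RTN bf16 conversion flag could still exercise the bfloat16 backward cases instead of always skipping them.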