Enable FAv3 by default 🎄🎅
danthe3rd authored Dec 18, 2024
1 parent 089f177 commit a00f10e
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion xformers/ops/fmha/dispatch.py
@@ -16,7 +16,7 @@
 T = TypeVar("T", Type[AttentionFwOpBase], Type[AttentionBwOpBase])
 
 
-_USE_FLASH_ATTENTION_3 = False
+_USE_FLASH_ATTENTION_3 = True
 
 
 def _set_use_fa3(use_flash_attention3: bool) -> None:
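The commit flips the module-level default so the fmha dispatcher prefers the FlashAttention-3 kernels when they are available. A minimal sketch of opting back out at runtime, assuming the private helper _set_use_fa3 shown in the diff above remains importable from xformers.ops.fmha.dispatch (it is an internal toggle, not a documented public API):

    from xformers.ops.fmha import dispatch

    # Revert to the previous behaviour (FA3 disabled), e.g. for debugging
    # or comparing against the FlashAttention-2 code path; this calls the
    # internal setter defined in dispatch.py as shown in the diff above.
    dispatch._set_use_fa3(False)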
