feat: support flash attention 2 in qwen2 vl vision blocks #1728

Annotations

1 warning

build (cuda) / integration_tests

succeeded Nov 4, 2024 in 45m 37s
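
The PR title describes wiring FlashAttention-2 into the Qwen2-VL vision transformer blocks. Below is a minimal sketch of the usual pattern for that kind of change; the class name, tensor shapes, and fallback path are assumptions for illustration, not the PR's actual code. The idea is to pack all image patch sequences into one flat tensor, call `flash_attn_varlen_func` when flash-attn is installed and the tensors are on GPU, and fall back to standard scaled-dot-product attention otherwise.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

try:
    from flash_attn import flash_attn_varlen_func  # FlashAttention-2 varlen kernel
    HAS_FLASH_ATTN = True
except ImportError:
    HAS_FLASH_ATTN = False


class VisionFlashAttention(nn.Module):
    """Hypothetical vision attention block with an optional FA2 fast path."""

    def __init__(self, dim: int, num_heads: int) -> None:
        super().__init__()
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, dim * 3)
        self.proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor, cu_seqlens: torch.Tensor) -> torch.Tensor:
        # x: (total_patches, dim) -- patch tokens of all images packed together.
        # cu_seqlens: (num_images + 1,) int32 cumulative patch counts per image.
        total, dim = x.shape
        qkv = self.qkv(x).reshape(total, 3, self.num_heads, self.head_dim)
        q, k, v = qkv.unbind(dim=1)  # each (total, num_heads, head_dim)

        if HAS_FLASH_ATTN and x.is_cuda:
            # FA2 varlen path: no padding, one fused kernel over all images.
            max_seqlen = int((cu_seqlens[1:] - cu_seqlens[:-1]).max())
            out = flash_attn_varlen_func(
                q, k, v,
                cu_seqlens_q=cu_seqlens, cu_seqlens_k=cu_seqlens,
                max_seqlen_q=max_seqlen, max_seqlen_k=max_seqlen,
            )
        else:
            # Fallback: run attention image-by-image with PyTorch SDPA.
            outs = []
            for i in range(cu_seqlens.numel() - 1):
                s, e = int(cu_seqlens[i]), int(cu_seqlens[i + 1])
                qi = q[s:e].transpose(0, 1).unsqueeze(0)  # (1, heads, seq, head_dim)
                ki = k[s:e].transpose(0, 1).unsqueeze(0)
                vi = v[s:e].transpose(0, 1).unsqueeze(0)
                oi = F.scaled_dot_product_attention(qi, ki, vi)
                outs.append(oi.squeeze(0).transpose(0, 1))  # (seq, heads, head_dim)
            out = torch.cat(outs, dim=0)

        return self.proj(out.reshape(total, dim))
```

Note that `flash_attn_varlen_func` expects `cu_seqlens` as an int32 tensor on the same CUDA device as `q`/`k`/`v`, and fp16 or bf16 inputs; the SDPA fallback keeps the block usable on CPU or when flash-attn is not installed.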