Issues: Dao-AILab/flash-attention

Issues list

[HDIM=96] head dim = 96?
#1238 opened Sep 19, 2024 by SunNy820828449
fp8 not enabled for mha_varlen_fwd
#1232 opened Sep 16, 2024 by goldhuang
[BUG] 2 tests failed...?
#1231 opened Sep 16, 2024 by ziyuhuang123
Additive Bias in Flash Attention
#1219 opened Sep 11, 2024 by kkh517
ONNX export issue
#1216 opened Sep 10, 2024 by scuizhibin