sci-ml/flash-attn
Flash Attention: Fast and Memory-Efficient Exact Attention (Python component).
flash-attn-2.8.3 (~amd64)
USE flags: cuda rocm python_single_target_python3_11 python_single_target_python3_12 python_single_target_python3_13 debug
License: BSD
Overlay: tatsh-overlay
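
As a quick orientation, below is a minimal usage sketch assuming the installed package provides the upstream flash_attn module with its flash_attn_func entry point; the function name, tensor shapes, and dtypes are assumptions taken from upstream Flash Attention releases, not from this listing, and a CUDA- or ROCm-capable GPU is required.

```python
# A hedged usage sketch, assuming the flash_attn package installed by this
# ebuild exposes flash_attn.flash_attn_func (upstream API; not documented here).
# Inputs must be fp16/bf16 tensors on the GPU with shape
# (batch, seqlen, nheads, headdim).
import torch
from flash_attn import flash_attn_func

batch, seqlen, nheads, headdim = 2, 1024, 8, 64
q = torch.randn(batch, seqlen, nheads, headdim, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# Exact causal self-attention computed without materializing the full
# seqlen x seqlen attention matrix in GPU memory.
out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # torch.Size([2, 1024, 8, 64])
```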
Bugs
Bug # | Severity | Platform | Status | Description
---|---|---|---|---
These bugs were grabbed from http://bugs.gentoo.org and have only passed a preliminary search using the package title;
for a more thorough search, please visit http://bugs.gentoo.org.