sci-ml/flash-attn
Flash Attention: Fast and Memory-Efficient Exact Attention (Python component).
flash-attn-2.8.3 ~amd64
USE flags: cuda rocm debug python_single_target_python3_11 python_single_target_python3_12 python_single_target_python3_13
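The USE flags above can be set per-package before emerging. A minimal sketch, following standard Gentoo `package.use` conventions (the file name and the choice of enabling `cuda` with the Python 3.12 single target are illustrative, not prescribed by the ebuild):

```
# /etc/portage/package.use/flash-attn
# Example: build with CUDA support, targeting Python 3.12 only
sci-ml/flash-attn cuda python_single_target_python3_12
```

With the flags set, the package can be installed from the overlay with `emerge --ask sci-ml/flash-attn` (assuming the tatsh-overlay repository is already enabled).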
License: BSD
Overlay: tatsh-overlay

