sci-ml/exllamav3
Inference library for running local LLMs on consumer hardware.
exllamav3-0.0.11
Keywords: ~amd64
USE flags: python_single_target_python3_11 python_single_target_python3_12 python_single_target_python3_13 debug
License: MIT
Overlay: tatsh-overlay
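A minimal installation sketch, assuming app-eselect/eselect-repository is installed and the overlay is registered under the name tatsh-overlay; the package.use and package.accept_keywords paths and the chosen Python target are illustrative, not prescribed by the overlay:

    # Enable and sync the overlay
    eselect repository enable tatsh-overlay
    emaint sync -r tatsh-overlay

    # Pick one Python target and accept the ~amd64 keyword, then install
    echo "sci-ml/exllamav3 python_single_target_python3_12" >> /etc/portage/package.use/exllamav3
    echo "sci-ml/exllamav3 ~amd64" >> /etc/portage/package.accept_keywords/exllamav3
    emerge --ask sci-ml/exllamav3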

