sci-ml/exllamav2
Inference library for running local LLMs on consumer hardware.
exllamav2-0.3.2 ~amd64
USE: python_single_target_python3_11 python_single_target_python3_12 python_single_target_python3_13 debug
License: MIT
Overlay: tatsh-overlay
USE Flags
python_single_target_python3_11
Build for Python 3.11 only
python_single_target_python3_12
Build for Python 3.12 only
python_single_target_python3_13
Build for Python 3.13 only
debug
Global: Enable extra debug codepaths, like asserts and extra output. If you want to get meaningful backtraces see http://www.gentoo.org/proj/en/qa/backtraces.xml
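Since this package lives in the tatsh-overlay rather than the main Gentoo tree, installing it requires adding the overlay and setting one of the Python single-target flags. A minimal sketch, assuming `eselect repository` (from app-eselect/eselect-repository) is available and the overlay name matches the listing above:

```shell
# Enable and sync the tatsh-overlay (names assumed from the listing above).
eselect repository enable tatsh-overlay
emaint sync -r tatsh-overlay

# Pick one Python target for the package; 3.12 is an arbitrary example choice.
echo "sci-ml/exllamav2 python_single_target_python3_12" >> /etc/portage/package.use/exllamav2

# The package is keyworded ~amd64, so accept testing keywords for it.
echo "sci-ml/exllamav2 ~amd64" >> /etc/portage/package.accept_keywords/exllamav2

# Build and install.
emerge --ask sci-ml/exllamav2
```

The single-target flags are mutually exclusive: exactly one must be enabled, and it should match the active Python interpreter selected on the system.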