dev-python/olive-ai
Olive: Simplify ML Model Finetuning, Conversion, Quantization, and Optimization for CPUs, GPUs and NPUs. [wheel]
olive-ai-0.11.0 ~amd64 ~x86
USE flags: aimet-onnx auto-opt azureml bnb capture-onnx-graph cpu diffusers directml docker finetune flash-attn gpu inc lora nvmo openvino optimum qnn shared-cache tf torch-tensorrt tune-session-params python_targets_python3_11 python_targets_python3_12 python_targets_python3_13 python_targets_python3_14
License: MIT
Overlay: pypi
USE Flags
aimet-onnx
* This flag is undocumented *
auto-opt
* This flag is undocumented *
azureml
* This flag is undocumented *
bnb
* This flag is undocumented *
capture-onnx-graph
* This flag is undocumented *
cpu
* This flag is undocumented *
diffusers
* This flag is undocumented *
directml
* This flag is undocumented *
docker
* This flag is undocumented *
finetune
* This flag is undocumented *
flash-attn
* This flag is undocumented *
gpu
Enable GPU support
inc
* This flag is undocumented *
lora
* This flag is undocumented *
nvmo
* This flag is undocumented *
openvino
* This flag is undocumented *
optimum
* This flag is undocumented *
qnn
* This flag is undocumented *
shared-cache
* This flag is undocumented *
tf
* This flag is undocumented *
torch-tensorrt
* This flag is undocumented *
tune-session-params
* This flag is undocumented *
python_targets_python3_11
* This flag is undocumented *
python_targets_python3_12
* This flag is undocumented *
python_targets_python3_13
* This flag is undocumented *
python_targets_python3_14
* This flag is undocumented *
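
As a rough sketch of how these flags are typically selected, assuming a standard Portage setup with the pypi overlay already enabled, an /etc/portage/package.use entry for this package could look like the following; the particular flag combination shown is illustrative only, not a recommended configuration:

    # /etc/portage/package.use/olive-ai
    # Illustrative selection: enable GPU, OpenVINO and Optimum support,
    # explicitly disable the Azure ML and Docker integrations
    dev-python/olive-ai gpu openvino optimum -azureml -docker

After adjusting the entry, rebuilding with emerge dev-python/olive-ai picks up the chosen feature set.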

