dev-python/litellm
Call all LLM APIs using the OpenAI format (OpenAI, Azure, Anthropic, Cohere, Groq, TogetherAI, HuggingFace, etc.)
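To show what the package provides, here is a minimal usage sketch of litellm's OpenAI-format completion() call; the model string, prompt, and the OPENAI_API_KEY value are illustrative assumptions, not taken from this listing.

# Minimal sketch: one OpenAI-format chat call routed by litellm.
# Model name and API key are placeholders; set a real key before running.
import os
from litellm import completion

os.environ["OPENAI_API_KEY"] = "sk-placeholder"  # assumed credential

response = completion(
    model="gpt-4o-mini",  # any provider/model string litellm supports
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)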
litellm-1.81.3
Keywords: ~amd64 ~arm64
USE: proxy server ui -extra python_targets_python3_12 python_targets_python3_13 python_targets_python3_14
License: MIT
Overlay: ha-bleending-edge

litellm-1.80.11
Keywords: ~amd64 ~arm64
USE: proxy server ui -extra python_targets_python3_12 python_targets_python3_13 python_targets_python3_14
License: MIT
Overlay: ha-bleending-edge

litellm-1.80.7
Keywords: ~amd64 ~arm64
USE: proxy server ui -extra python_targets_python3_12 python_targets_python3_13 python_targets_python3_14
License: MIT
Overlay: ha-bleending-edge
USE Flags
proxy
Global: Enable proxy support
server
Global: Installs scripts to be used on the server-side of this app
ui
Global: Build the user interface (could be gtk or ncurses based, depending on sdl, dga, svga and aalib USE flags)
extra
* This flag is undocumented *
python_targets_python3_12
* This flag is undocumented *
python_targets_python3_13
* This flag is undocumented *
python_targets_python3_14
* This flag is undocumented *
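A hedged sketch of how these flags would typically be set on Gentoo, assuming the standard Portage config layout, the ha-bleending-edge overlay already enabled, and an example flag selection (proxy and server on, ui and extra off); the python_targets_* flags normally follow the PYTHON_TARGETS variable in make.conf rather than being set per package.

# /etc/portage/package.accept_keywords/litellm -- accept the testing keyword
dev-python/litellm ~amd64

# /etc/portage/package.use/litellm -- example flag selection only
dev-python/litellm proxy server -ui -extra

# then build the package
emerge --ask dev-python/litellm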

