dev-python/torchdistill
A modular, configuration-driven framework for knowledge distillation. Trained models, training logs, and configurations are available to ensure reproducibility.
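For context, the core idea behind knowledge distillation frameworks like this one can be sketched with the classic temperature-scaled loss from Hinton et al. The snippet below is an illustrative stdlib-only implementation, not torchdistill's actual API; the function names and the example logits are made up for demonstration.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=4.0):
    # KL(teacher || student) over temperature-softened distributions,
    # scaled by T^2 as in the original distillation formulation.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# Hypothetical teacher/student logits for a 3-class problem.
teacher = [2.0, 1.0, 0.1]
student = [1.5, 1.2, 0.3]
print(kd_loss(teacher, student))
```

The loss is zero when the student matches the teacher exactly and grows as their softened output distributions diverge.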
Reverse Dependencies
Reverse dependencies are sometimes conditional on your USE flags, the ebuild version, and other installed packages; please keep this in mind.

