A collection of resources and tools related to knowledge distillation, a model-compression technique that transfers knowledge from a larger teacher model to a smaller student model.
A PyTorch implementation of various Knowledge Distillation (KD) methods for model compression and transfer learning.
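As a rough illustration of what these projects implement, here is a minimal sketch of the classic distillation objective (Hinton et al.): a weighted sum of a KL-divergence term between temperature-softened teacher and student logits and a standard cross-entropy term on the ground-truth labels. The function name and the default values for the temperature `T` and mixing weight `alpha` are illustrative assumptions, not taken from either project.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soft-target term: KL divergence between temperature-softened
    # distributions; scaled by T^2 to keep gradient magnitudes comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

In practice the teacher's logits are computed under `torch.no_grad()` so that only the student receives gradient updates.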