A collection of resources and tools related to knowledge distillation, a technique for compressing and transferring knowledge from a larger model to a smaller one.
A C++11 header-only library for Nearest Neighbor search with KD-trees
A PyTorch library for flexibly running both deep and shallow knowledge distillation experiments.
A PyTorch implementation of various Knowledge Distillation (KD) methods for model compression and transfer learning.
A comprehensive survey on knowledge distillation techniques for large language models.
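At its core, the knowledge distillation these libraries implement trains a small student model to match the temperature-softened output distribution of a larger teacher. A minimal sketch of that softened-softmax KL loss is below; the function names and temperature value are illustrative, not taken from any of the listed projects.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature scaling; a higher T gives a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions.

    This is the Hinton-style distillation objective: the student is pushed
    to reproduce the teacher's soft probabilities, not just its hard labels.
    """
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    # KL(p || q), scaled by T^2 so gradient magnitudes stay comparable across T
    return temperature ** 2 * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

teacher = [3.0, 1.0, 0.2]
# A student that matches the teacher exactly incurs zero loss;
# any mismatch yields a positive loss.
print(distillation_loss(teacher, teacher))           # 0.0
print(distillation_loss(teacher, [1.0, 1.0, 1.0]) > 0)
```

In practice this loss is usually combined with the ordinary cross-entropy on ground-truth labels, weighted by a mixing coefficient.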