Explore Projects

Discover 2 open source projects

Active filters (1):
Search: knowledge-transfer


dkozlov/awesome-knowledge-distillation

A collection of resources and tools related to knowledge distillation, a technique for compressing and transferring knowledge from a larger model to a smaller one.

3.8K
Stable
Knowledge Distillation
#knowledge-distillation #model-compression #deep-learning
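The distillation technique this project catalogs can be sketched in a few lines: the student is trained to match the teacher's temperature-softened output distribution. The sketch below is a minimal, framework-free illustration of the classic soft-target loss; the function names are illustrative and not taken from the repository above.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher temperature softens the
    # distribution, exposing the teacher's "dark knowledge" about
    # relative class similarities.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence between the softened teacher and student
    # distributions, scaled by T^2 so gradient magnitudes stay
    # comparable across temperatures (as in the standard formulation).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

In practice this soft-target term is combined with the ordinary cross-entropy on the true labels, weighted by a mixing coefficient.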

AberHu/Knowledge-Distillation-Zoo

A PyTorch implementation of various Knowledge Distillation (KD) methods for model compression and transfer learning.

1.7K
Archived
Python
Model Compression
Knowledge Distillation
PyTorch
#distillation #knowledge-transfer #model-compression
