Explore Projects

Discover 2 open source projects

Active filters (1):
Search: model-distillation

Showing 1-2 of 2 projects

dkozlov/awesome-knowledge-distillation

A collection of resources and tools related to knowledge distillation, a technique for compressing and transferring knowledge from a larger model to a smaller one.

3.8K
Stable
Knowledge Distillation
#knowledge-distillation#model-compression#deep-learning
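The description above summarizes the core idea behind the resources this repo collects: a small "student" model is trained to match the softened output distribution of a larger "teacher". A minimal sketch of the standard distillation loss (KL divergence between temperature-softened softmax outputs, in the style of Hinton et al.) is shown below; it is plain illustrative Python, not code from the repository.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with temperature scaling; higher T softens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions.

    Scaled by T^2 so gradient magnitudes stay comparable as T varies,
    following the usual formulation in the distillation literature.
    """
    p = softmax(teacher_logits, temperature)  # teacher "soft targets"
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# Identical logits give zero loss; diverging logits give a positive loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # 0.0
print(distillation_loss([0.1, 1.0, 2.0], [2.0, 1.0, 0.1]))  # > 0
```

In practice this term is combined with the ordinary cross-entropy against hard labels, and the soft targets let the small model learn from the teacher's relative confidences across wrong classes.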

autodistill/autodistill

An AI-powered tool for training supervised models without manual labeling, using foundation models and multimodal learning.

2.6K
Experimental
Python
Computer Vision
Model Distillation
PyTorch
#auto-labeling#computer-vision#foundation-models
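The workflow this project describes, using a large foundation model to label raw data, then training a small supervised model on those labels, can be sketched as below. Note this is a hypothetical illustration of the auto-labeling idea only; the names (`Ontology`, `label_with_base_model`, `toy_base_model`) are invented for the sketch and are not autodistill's actual API.

```python
from dataclasses import dataclass

@dataclass
class Ontology:
    # Maps the natural-language prompt sent to the foundation model
    # to the class name the downstream target model will be trained on.
    prompt_to_class: dict

def label_with_base_model(images, ontology, base_model):
    """Score each unlabeled image against every prompt in the ontology
    and keep the best-scoring class -- no human annotation involved."""
    dataset = []
    for image in images:
        scores = {cls: base_model(image, prompt)
                  for prompt, cls in ontology.prompt_to_class.items()}
        dataset.append((image, max(scores, key=scores.get)))
    return dataset

# Stand-in "foundation model": counts prompt words appearing in a text
# description of the image. A real pipeline would use a multimodal model.
def toy_base_model(image, prompt):
    return sum(word in image for word in prompt.split())

ontology = Ontology({"a photo of a cat": "cat", "a photo of a dog": "dog"})
images = ["small cat photo of a cat", "photo of a dog on grass"]
labeled = label_with_base_model(images, ontology, toy_base_model)
print(labeled)  # each image paired with its auto-assigned class
```

The resulting `(image, label)` pairs would then feed an ordinary supervised training loop for a compact target model, which is the "distillation" step the project's name refers to.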
