A collection of resources and tools for knowledge distillation, a technique for compressing a large model by transferring its knowledge to a smaller one.
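To make the idea concrete, here is a minimal sketch of the core distillation objective: the student is trained to match the teacher's temperature-softened output distribution via KL divergence. This is a generic illustration of the technique (in the spirit of Hinton et al.'s formulation), not code from any listed project; the logits and temperature are made-up values.

```python
import math

def softmax(logits, temperature=1.0):
    # Scale logits by temperature; higher T yields a softer distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2

# Example values (hypothetical): the student minimizes this loss,
# usually combined with cross-entropy on the ground-truth labels.
loss = distillation_loss([4.0, 1.0, 0.2], [3.0, 1.5, 0.5])
```

In practice the same objective is computed over mini-batches with a deep-learning framework; this stdlib-only version just shows the math.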
An AI-powered tool for training supervised models without manual labeling, using foundation models and multimodal learning.
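One common way such tools avoid manual labeling is pseudo-labeling: a foundation model labels raw data, and only high-confidence predictions are kept as training examples for the smaller supervised model. The sketch below illustrates that pattern with a stand-in `predict` function; the function, threshold, and data are all hypothetical, not the tool's actual API.

```python
def pseudo_label(unlabeled, predict, threshold=0.9):
    # Keep only examples the (foundation) model labels with high confidence;
    # these become the training set for the downstream supervised model.
    labeled = []
    for x in unlabeled:
        label, confidence = predict(x)
        if confidence >= threshold:
            labeled.append((x, label))
    return labeled

# `mock_predict` stands in for a foundation-model call (hypothetical).
mock_predict = lambda x: ("positive", 0.95) if "good" in x else ("negative", 0.6)
data = pseudo_label(["good movie", "meh"], mock_predict)
# Only the high-confidence example survives the threshold.
```

Real systems add refinements (label-model aggregation over multiple sources, active learning on low-confidence items), but the confidence filter above is the basic mechanism.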