A comprehensive image classification and recognition library powered by the PaddlePaddle deep learning framework.
A collection of resources and tools related to knowledge distillation, a technique for transferring knowledge from a large teacher model to a smaller student model (sketched in the first example after this list).
Pretrained language models and optimization techniques for large-scale distributed AI/ML development.
Optimizes large language models through low-bit quantization and sparsity to improve model compression (a minimal quantization sketch follows this list).
EasyNLP is an easy-to-use NLP toolkit for tasks like text classification, retrieval, and generation.
A flexible PyTorch library for deep and shallow knowledge distillation experiments.
A curated list of work on efficient and compressed large language models.
An AI-powered tool to generate knowledge graphs from text data, with visualization capabilities.
A collection of Microsoft's work on neural architecture search (NAS) and Vision Transformers for efficient AI models.
A PyTorch implementation of various Knowledge Distillation (KD) methods for model compression and transfer learning.
An open-source toolbox and benchmark for model compression and acceleration in PyTorch.
A PyTorch-based framework for reproducible deep learning studies with 26 knowledge distillation methods.
A PyTorch implementation of attention transfer for improving convolutional networks (see the attention-transfer sketch after this list).
A comprehensive PyTorch-based code repository for image classification, covering training, prediction, test-time augmentation (TTA), model fusion, deployment, and more.
A Python toolkit for building and fine-tuning deep learning models for NLP tasks such as question answering.
Efficient computing methods developed by Huawei Noah's Ark Lab for model compression and optimization.
A comprehensive survey on knowledge distillation techniques for large language models.
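Since knowledge distillation is the recurring theme of these projects, here is a minimal PyTorch sketch of the classic soft-target distillation loss; the temperature T and mixing weight alpha are illustrative hyperparameters, not values taken from any listed project.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    # Soften both output distributions with temperature T.
    soft_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_soft_student = F.log_softmax(student_logits / T, dim=-1)
    # KL divergence between soft teacher and soft student; the T**2 factor
    # keeps gradient magnitudes comparable across temperatures.
    kd = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T ** 2)
    # Standard hard-label cross-entropy on the student's raw logits.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```

In a training loop, teacher_logits would typically come from a frozen teacher evaluated under torch.no_grad(), so only the student receives gradients.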
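For the low-bit compression mentioned above, a bare-bones sketch of symmetric per-tensor int8 weight quantization, assuming nothing about any particular toolkit's API:

```python
import torch

def quantize_int8(weight: torch.Tensor):
    # Symmetric per-tensor quantization: map [-max|w|, +max|w|] onto
    # [-127, 127] with a single scale factor (clamped to avoid div-by-zero).
    scale = weight.abs().max().clamp(min=1e-8) / 127.0
    q = torch.clamp(torch.round(weight / scale), -127, 127).to(torch.int8)
    return q, scale

def dequantize_int8(q: torch.Tensor, scale: torch.Tensor) -> torch.Tensor:
    # Recover an approximate float tensor for computation or error inspection.
    return q.to(torch.float32) * scale
```

Production toolkits layer per-channel scales, zero points, calibration, and sparsity patterns on top of this core mapping.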
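And a minimal sketch of the attention-transfer idea: match spatial attention maps between teacher and student feature maps of the same height and width; the helper names here are hypothetical, not the project's API.

```python
import torch
import torch.nn.functional as F

def attention_map(features: torch.Tensor) -> torch.Tensor:
    # Collapse channels into a spatial attention map (sum of squared
    # activations), then flatten and L2-normalize per sample.
    attn = features.pow(2).sum(dim=1).flatten(1)
    return F.normalize(attn, dim=1)

def attention_transfer_loss(student_feat, teacher_feat):
    # Mean squared distance between normalized attention maps from matched
    # layers; normalization makes the loss insensitive to activation scale,
    # so layers of different channel widths remain comparable.
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()
```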