Explore Projects

Discover 17 open source projects

Active filters (1):
Search: knowledge-distillation

Showing 1-17 of 17 projects

PaddlePaddle/PaddleClas

A comprehensive visual classification and recognition library powered by the PaddlePaddle deep learning framework.

5.8K
Stable
Python
Computer Vision
API Frameworks
Python
#image-classification #image-recognition #image-retrieval

dkozlov/awesome-knowledge-distillation

A collection of resources and tools related to knowledge distillation, a technique for compressing and transferring knowledge from a larger model to a smaller one.

3.8K
Stable
Knowledge Distillation
#knowledge-distillation #model-compression #deep-learning
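The description above summarizes the core idea; as a minimal, framework-free sketch (plain Python with illustrative names, not code from any listed project), the classic Hinton-style distillation loss compares temperature-softened teacher and student distributions:

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: a higher T softens the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 so gradient magnitudes stay comparable across
    temperatures (as suggested by Hinton et al., 2015)."""
    p = softmax(teacher_logits, temperature)   # soft targets
    q = softmax(student_logits, temperature)   # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * temperature ** 2

# A student whose logits roughly track the teacher's incurs a lower loss
# than one whose logits are reversed.
teacher = [6.0, 2.0, -1.0]
close_student = [5.5, 2.2, -0.8]
far_student = [-1.0, 2.0, 6.0]
assert distillation_loss(teacher, close_student) < distillation_loss(teacher, far_student)
```

In practice a training objective mixes this soft-target term with the ordinary cross-entropy on hard labels.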

huawei-noah/Pretrained-Language-Model

Pretrained language models and optimization techniques for large-scale distributed AI/ML development.

3.2K
Archived
Python
LLM Frameworks
Model Compression
Python
#pretrained-models #knowledge-distillation #large-scale-distributed

intel/neural-compressor

Optimizes large language models with low-bit quantization and sparsity to improve model compression.

2.6K
Active
Python
LLM Frameworks
PyTorch
#quantization #post-training-quantization #sparsity
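Post-training quantization, one of the techniques this project's tags name, can be sketched in a few lines. The following is an illustrative symmetric int8 round-trip in plain Python (hypothetical helper names, not the intel/neural-compressor API):

```python
def quantize_int8(values):
    """Symmetric post-training quantization: map floats to int8 using a
    single scale derived from the largest absolute value."""
    scale = max(abs(v) for v in values) / 127.0 or 1.0  # avoid zero scale
    q = [max(-128, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats from int8 codes."""
    return [qi * scale for qi in q]

weights = [0.82, -1.27, 0.05, 0.0, -0.4]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# Round-trip error is bounded by half a quantization step (scale / 2).
assert all(abs(w - r) <= scale / 2 + 1e-9 for w, r in zip(weights, restored))
```

Real toolkits add per-channel scales, calibration over activation statistics, and asymmetric (zero-point) schemes on top of this basic idea.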

alibaba/EasyNLP

EasyNLP is an easy-to-use NLP toolkit for tasks like text classification, retrieval, and generation.

2.2K
Archived
Python
LLM Frameworks
ML Ops
PyTorch
#natural-language-processing #text-classification #text-generation

haitongli/knowledge-distillation-pytorch

A PyTorch library for flexibly exploring deep and shallow knowledge distillation experiments.

2.0K
Archived
Python
Computer Vision
Model Compression
PyTorch
#computer-vision #model-compression #knowledge-distillation

horseee/Awesome-Efficient-LLM

A curated list of efficient and compressed large language models for developers to explore.

2.0K
Experimental
Python
LLM Frameworks
LLM Compression
#compression #efficient-llm #knowledge-distillation

robert-mcdermott/ai-knowledge-graph

An AI-powered tool to generate knowledge graphs from text data, with visualization capabilities.

1.9K
Stable
Python
LLM Frameworks
API Frameworks
Python
#artificial-intelligence #knowledge-graph #llm

microsoft/Cream

A collection of Microsoft's research on neural architecture search (NAS) and Vision Transformers for efficient AI models.

1.8K
Archived
Python
Computer Vision
ML Ops
Python
#automl #efficiency #knowledge-distillation

AberHu/Knowledge-Distillation-Zoo

A PyTorch implementation of various Knowledge Distillation (KD) methods for model compression and transfer learning.

1.7K
Archived
Python
Model Compression
Knowledge Distillation
PyTorch
#distillation #knowledge-transfer #model-compression

open-mmlab/mmrazor

An open-source toolbox and benchmark for model compression and acceleration in PyTorch.

1.7K
Archived
Python
ML Ops
API Frameworks
PyTorch
#model-compression #model-acceleration #benchmark

yoshitomo-matsubara/torchdistill

A PyTorch-based framework for reproducible deep learning studies with 26 knowledge distillation methods.

1.6K
Stable
Python
ML Ops
Computer Vision
PyTorch
#deep-learning #computer-vision #natural-language-processing

szagoruyko/attention-transfer

A PyTorch implementation of attention transfer for improving convolutional networks.

1.5K
Archived
Jupyter Notebook
PyTorch
#attention-transfer #deep-learning #knowledge-distillation
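As a rough illustration of the attention-transfer idea (plain Python with illustrative names, not this repository's code): a spatial attention map is formed by summing squared activations over channels, and the student is trained to match the teacher's normalized maps:

```python
import math

def attention_map(feature_map):
    """Collapse a C x H x W activation tensor (nested lists here) into a
    flat spatial attention map by summing squared activations over the
    channel dimension, then L2-normalizing, following the attention
    transfer formulation of Zagoruyko & Komodakis (2016)."""
    channels = len(feature_map)
    h, w = len(feature_map[0]), len(feature_map[0][0])
    flat = [
        sum(feature_map[c][i][j] ** 2 for c in range(channels))
        for i in range(h) for j in range(w)
    ]
    norm = math.sqrt(sum(v * v for v in flat)) or 1.0  # avoid divide-by-zero
    return [v / norm for v in flat]

def at_loss(student_fm, teacher_fm):
    """L2 distance between the two normalized attention maps."""
    s, t = attention_map(student_fm), attention_map(teacher_fm)
    return math.sqrt(sum((si - ti) ** 2 for si, ti in zip(s, t)))

# Toy 2-channel, 2x2 feature maps: identical activations give zero loss.
teacher = [[[1.0, 0.0], [0.0, 2.0]], [[0.5, 0.0], [0.0, 1.0]]]
assert at_loss(teacher, teacher) < 1e-12
```

Because the maps are normalized, the loss compares *where* each network attends rather than the raw activation magnitudes.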

lxztju/pytorch_classification

A comprehensive PyTorch-based code repository for image classification, including training, prediction, TTA, model fusion, deployment, and more.

1.5K
Archived
Jupyter Notebook
Computer Vision
API Frameworks
PyTorch
#image-classification #cnn #resnet

microsoft/NeuronBlocks

A Python toolkit for building and fine-tuning deep learning models, including NLP and question answering.

1.5K
Archived
Python
Prompt Engineering
PyTorch
#natural-language-processing #question-answering #deep-learning

huawei-noah/Efficient-Computing

Efficient computing methods developed by Huawei Noah's Ark Lab for model compression and optimization.

1.3K
Archived
Jupyter Notebook
Model Compression
Quantization
#model-compression #quantization #pruning

Tebmer/Awesome-Knowledge-Distillation-of-LLMs

A comprehensive survey on knowledge distillation techniques for large language models.

1.3K
Experimental
LLM Frameworks
Tutorials & Courses
#knowledge-distillation #large-language-model #survey
