A collection of pre-trained deep learning models and demos optimized for high performance using the OpenVINO toolkit.
MiniOB is a lightweight database that helps developers learn the fundamentals of database systems.
Must-read papers on prompt-based tuning for pre-trained language models.
CLUE is a comprehensive Chinese language understanding evaluation benchmark with datasets, baselines, pre-trained models, and a leaderboard.
An efficient, scalable RL training framework for large language models that interleave reasoning with search-engine calls.
A unified and scalable ML library for large-scale distributed training, model serving, and federated learning.
A pre-trained ALBERT model for self-supervised learning of Chinese language representations.
A Jupyter Notebook project that helps developers label their own data and train custom AI models.
RAFT (Recurrent All-Pairs Field Transforms) is a PyTorch model for optical flow estimation, a core computer vision task.
Open-source training materials and code samples for learning Docker, containers, and Kubernetes.
TorchGeo is a Python library for working with geospatial data using PyTorch, providing datasets, samplers, transforms, and pre-trained models.
A transfer learning library for domain adaptation, task adaptation, and domain generalization.
A modular and easy-to-use Python library for training and using state-of-the-art retrieval models like ColBERT in RAG pipelines.
A high-performance deep learning framework for training and deploying neural networks on various hardware.
A list of satellite image training datasets with annotations for computer vision and deep learning.
Lasagne is a lightweight library for building and training neural networks on top of Theano, a Python library for efficient mathematical computation.
A pre-training toolbox and benchmark for vision AI models, including self-supervised learning and state-of-the-art architectures.
A simple reinforcement learning training library for reasoning tasks.
A collection of resources and tools related to knowledge distillation, a technique for compressing and transferring knowledge from a larger model to a smaller one.
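The distillation idea mentioned above can be sketched in a few lines. This is an illustrative, self-contained example (not code from any listed project): the student model is trained to match the teacher's temperature-softened output distribution by minimizing the KL divergence between the two.

```python
import math

def softmax(logits, temperature=1.0):
    # Soften the distribution by dividing logits by the temperature;
    # higher temperatures expose more of the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence between the teacher's and the student's softened
    # output distributions; the student minimizes this during training.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Identical logits give zero loss; disagreement increases it.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # → 0.0
```

In practice this soft-label term is combined with the ordinary cross-entropy loss on the true labels, weighted by a hyperparameter.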
TimeGPT-1 is a production-ready pre-trained time series foundation model for forecasting and anomaly detection.