A data repository for pre-trained NLP models and corpora to use in language processing projects.
A PyTorch-based toolbox for open-world, high-quality image segmentation, with pre-trained models.
A plug-and-play library for parameter-efficient tuning (Delta Tuning) of pre-trained language models.
An open-source Python library for open-vocabulary segmentation with a pre-trained Segment Anything Model (SAM).
A Python project that generates realistic-sounding text with a pre-trained GPT-2 model, useful for language-generation experiments.
Fast and simple repository for pre-training and fine-tuning T5-style language models.
KoGPT, a GPT-based Korean language model for natural language processing tasks.
An open-source repository of pre-trained language models for various AI and NLP applications.
A library for building Mixture-of-Experts (MoE) models from the LLaMA language model with continual pre-training.
High-level network definitions with pre-trained weights in TensorFlow for deep learning and computer vision tasks.