Comprehensive resources on in-context learning and prompt engineering for large language models such as ChatGPT and GPT-3.
A self-supervised video representation learning model for video understanding tasks.
jiant is an NLP toolkit that provides pre-trained BERT models and tools for multi-task learning and transfer learning.
A pre-trained Chinese XLNet model for natural language processing tasks.
A collection of pre-trained models for the dlib computer vision and machine learning library.
Pre-trained neural network models in Elixir's Axon framework, with integration for Hugging Face models.
A pre-trained Chinese language model for text generation, useful for AI-powered coding and content creation.
A PyTorch implementation of OpenAI's pre-trained transformer language model for fine-tuning and inference.
A library of pre-trained LSTM models for optical character recognition (OCR) tasks.
An open-source large language model (LLM) for AI-powered coding and developer discovery tools.
Pre-trained Chinese ELECTRA language model for natural language processing tasks.
A library for self-supervised learning on graphs, providing contrastive, generative, and predictive pretext tasks.
A unified-modal speech-text pre-training framework for spoken language processing tasks.
A PyTorch library for easy access to pre-trained models and model components for developers.
A framework for unified training of universal time series forecasting transformers.
A Python library and tools for generating and inspecting data for pre-training large language models (LLMs).
A spaCy extension for using pre-trained transformer models such as BERT, GPT-2, and XLNet in NLP pipelines.
Korean BERT pre-trained model for natural language processing tasks in the Korean language.
An open-source framework for building knowledgeable large language models with fine-tuning capabilities.
A collection of pre-trained computer vision models for developers building AI-powered applications.