Microsoft's research repository for large-scale self-supervised pre-training across tasks, languages, and modalities.
An open-source framework for prompt-learning, a technique for adapting pre-trained language models to downstream tasks.
A pre-trained ALBERT model for self-supervised learning of Chinese language representations.
A benchmark for evaluating language understanding models and datasets for the Chinese language.
Pre-trained Chinese ELECTRA language model for natural language processing tasks.
Unified training of universal Transformers for time series forecasting.
An open-source framework for building knowledgeable large language models with fine-tuning capabilities.
A PyTorch implementation of a BERT-style pretraining method for convolutional networks, enabling more efficient self-supervised learning.
A novel neural operator called Involution that can be used for image classification, object detection, and other computer vision tasks.
A curated list of must-read papers on knowledge editing for large language models.
Official repository for the Uni-Mol Series Methods, a deep learning-based molecular modeling library.