An open-source implementation of the LoRA (Low-Rank Adaptation) technique for fine-tuning large language models.
Pre-trained Chinese BERT model with whole word masking, useful for natural language processing tasks.
BertViz is a Python library for visualizing attention in transformer models like BERT, GPT-2, and RoBERTa.
A curated collection of high-quality, pretrained Chinese NLP models for various tasks and applications.
CLUE is a comprehensive Chinese language understanding evaluation benchmark with datasets, baselines, pre-trained models, and a leaderboard.
A pre-trained ALBERT model for self-supervised learning of Chinese language representations.
Open-source pre-training model framework in PyTorch with a pre-trained model zoo for natural language processing tasks.
Documentation of political activism, covering Chinese government censorship, human rights issues, and circumvention techniques.
A pre-trained RoBERTa language model for Chinese NLP tasks.
news-please is an integrated web crawler and information extractor for news that works out of the box.
A powerful BERT-based language model for natural language understanding and representation learning.
A Vue-based chat application with real-time messaging functionality.
A powerful NLP library for transfer learning and building question answering systems.
A fast and user-friendly runtime for running transformer models like BERT, GPT-2, and others on CPU and GPU.
CLUENER2020 is a Chinese fine-grained named entity recognition dataset and benchmark for AI-powered NLP development.
A PyTorch implementation of BERT for sequence-to-sequence tasks, including text summarization, classification, and generation.
Tencent's pre-training framework in PyTorch with a pre-trained model zoo for NLP tasks.