Comprehensive Chinese NLP resource collection for developers
BertViz is a Python library for visualizing attention in transformer models like BERT, GPT-2, and RoBERTa.
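Attention maps are the quantity such tools visualize: for each head, every query token gets a probability distribution over key tokens. A minimal NumPy sketch of the underlying scaled dot-product weights (toy shapes for illustration, not BertViz's actual API):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_weights(Q, K):
    # Scaled dot-product attention weights: softmax(Q K^T / sqrt(d)).
    d = Q.shape[-1]
    return softmax(Q @ K.T / np.sqrt(d))

rng = np.random.default_rng(0)
Q = rng.normal(size=(5, 8))  # 5 query tokens, head dimension 8
K = rng.normal(size=(5, 8))  # 5 key tokens
W = attention_weights(Q, K)  # shape (5, 5): one distribution per query token
```

Each row of `W` sums to 1; a tool like BertViz renders these rows as line weights or heatmaps per layer and head.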
A Chinese version of the GPT-2 language model training code, using the BERT tokenizer.
A Chinese NLP solution with large models, data, training, and inference capabilities for developers.
A pre-trained RoBERTa language model for Chinese NLP tasks.
Kashgari is a production-ready NLP transfer-learning framework built on top of tf.keras for text labeling and text classification.
GPT-2 multi-language pre-trained models, including a 1.5B parameter Chinese model, for text generation and NLP tasks.
A fast and user-friendly runtime for running transformer models like BERT, GPT-2, and others on CPU and GPU.
A Python implementation of GPT-2 training with TPU support.
A Python library for detecting and analyzing comparisons to ChatGPT in text, with a corpus of human-written comparisons.
A PyTorch implementation of BERT for sequence-to-sequence tasks, including text summarization, classification, and generation.
A Python library that generates Mario levels using a large language model (GPT-2).
A Chinese GPT-2 project for generating news article titles, with detailed annotations.
A simple PyTorch implementation of OpenAI's GPT-2 language model for text generation.
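The mechanism these GPT-2 implementations share is causal self-attention: future positions are masked before the softmax so each token attends only to itself and earlier tokens. A minimal NumPy sketch of that masking (toy shapes for illustration, not any listed project's API):

```python
import numpy as np

def causal_attention_weights(Q, K):
    # GPT-2-style causal self-attention weights: mask positions above
    # the diagonal with -inf before the softmax, so token i only
    # attends to tokens 0..i.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    n = scores.shape[0]
    mask = np.triu(np.ones((n, n), dtype=bool), k=1)  # True strictly above diagonal
    scores = np.where(mask, -np.inf, scores)
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))          # 4 tokens, head dimension 8
W = causal_attention_weights(x, x)   # shape (4, 4), zero above the diagonal
```

This lower-triangular structure is what lets the model generate text left to right, one token at a time.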