YAYI 2 is an open-source large language model (LLM) for Chinese developers, trained on over 20 trillion tokens.
Pretrained language models and optimization techniques for large-scale distributed AI/ML development.
Foundation Architecture for (M)LLMs, a powerful toolkit for building large language models.
A plug-and-play library for parameter-efficient tuning (Delta Tuning) of pre-trained language models.
A repository of research papers on text summarization, with a focus on language models and NLP.