Parameter-efficient fine-tuning for large models
A unified library for parameter-efficient and modular transfer learning in NLP with BERT, LoRA, and Transformers.
A unified platform for researchers to easily use and integrate various large language models and parameter-efficient techniques.
A lightweight and efficient virtual try-on diffusion model for fashion applications.
Code for a paper on parameter-efficient fine-tuning of large language models using adapter modules.
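Several of the projects above center on adapter- and LoRA-style parameter-efficient fine-tuning, where a frozen base weight is augmented with a small trainable low-rank update. A minimal sketch of that idea, in plain Python with hypothetical names (not taken from any of the listed repositories):

```python
# Sketch of a LoRA-style low-rank update (hypothetical names, no frameworks).
# The base weight W is frozen; only the low-rank factors A (r x d_in) and
# B (d_out x r) would be trained. The effective weight is W + (alpha/r) * B @ A.

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]

def lora_forward(x, W, A, B, alpha=1.0, r=1):
    """Apply y = (W + (alpha/r) * B @ A) x to a single input vector x."""
    delta = matmul(B, A)  # low-rank update, shape d_out x d_in
    scale = alpha / r
    return [sum((w + scale * d) * xi for w, d, xi in zip(wrow, drow, x))
            for wrow, drow in zip(W, delta)]

# Frozen 2x2 base weight plus a rank-1 update: only A and B (4 numbers here)
# would receive gradients, not the full weight matrix.
W = [[1.0, 0.0], [0.0, 1.0]]
A = [[1.0, 1.0]]    # r=1, d_in=2
B = [[0.5], [0.5]]  # d_out=2, r=1
print(lora_forward([1.0, 2.0], W, A, B, alpha=1.0, r=1))  # → [2.5, 3.5]
```

The trainable parameter count scales with r * (d_in + d_out) rather than d_in * d_out, which is why these methods stay cheap for large models.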