Exploring Training on Heterogeneous Data with Mixture of Low-rank Adapters

This repository is the official PyTorch implementation of "Exploring Training on Heterogeneous Data with Mixture of Low-rank Adapters" (arXiv:2406.09679).

Download Dataset

Run

For example, you can run the following command to train MoLA on the VLCS benchmark with a ResNet-50 backbone:

python train.py --benchmark vlcs_resnet50 --balancer ourbase --arch lora_soft_router --lora_layer 1 2 3 --lora_rank 4 4 4 8
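As a rough illustration of the idea behind the lora_soft_router architecture, the sketch below shows a frozen linear layer augmented with a mixture of low-rank (LoRA) adapters whose outputs are combined by a soft router. This is a minimal toy example, not the repository's actual code: the MoLALinear class name and all shapes and hyperparameters are invented for illustration.

```python
# Minimal sketch (assumptions, not the repo's code) of a soft-routed
# mixture of low-rank adapters on top of a frozen linear layer.
import torch
import torch.nn as nn


class MoLALinear(nn.Module):
    """A frozen linear layer plus a soft-routed mixture of LoRA adapters."""

    def __init__(self, in_features, out_features, num_adapters=4, rank=4):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        for p in self.base.parameters():
            p.requires_grad_(False)  # the base weights stay frozen
        # Per-adapter low-rank factors: A projects down to rank r, B projects back up.
        self.lora_A = nn.Parameter(torch.randn(num_adapters, in_features, rank) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(num_adapters, rank, out_features))
        # Soft router: maps the input to a distribution over adapters.
        self.router = nn.Linear(in_features, num_adapters)

    def forward(self, x):
        # x: (batch, in_features)
        weights = torch.softmax(self.router(x), dim=-1)  # (batch, num_adapters)
        # Per-adapter low-rank updates: (batch, num_adapters, out_features)
        delta = torch.einsum("bi,air,aro->bao", x, self.lora_A, self.lora_B)
        # Mix the adapter outputs with the router weights and add the frozen base output.
        return self.base(x) + torch.einsum("ba,bao->bo", weights, delta)


# Example usage
layer = MoLALinear(in_features=512, out_features=512, num_adapters=4, rank=4)
out = layer(torch.randn(8, 512))
print(out.shape)  # torch.Size([8, 512])
```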

Citation

If you find MoLA useful for your research or development, please cite the following:

@article{zhou2024exploring,
  title={Exploring Training on Heterogeneous Data with Mixture of Low-rank Adapters},
  author={Zhou, Yuhang and Zhao, Zihua and Li, Haolin and Du, Siyuan and Yao, Jiangchao and Zhang, Ya and Wang, Yanfeng},
  journal={arXiv preprint arXiv:2406.09679},
  year={2024}
}

Acknowledgement

This repository is built on top of LibMTL. We thank the authors for releasing their code.
