# Exploring Training on Heterogeneous Data with Mixture of Low-rank Adapters

This repository is the official PyTorch implementation of "Exploring Training on Heterogeneous Data with Mixture of Low-rank Adapters" (MoLA).

## Download Dataset

## Run

For example, you can run the following command for MoLA training:

```shell
python train.py --benchmark vlcs_resnet50 --balancer ourbase --arch lora_soft_router --lora_layer 1 2 3 --lora_rank 4 4 4 8
```
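To give a rough intuition for what the `lora_soft_router` architecture computes, below is a minimal NumPy sketch of a linear layer with a soft-routed mixture of low-rank (LoRA) adapters: a frozen base weight plus several rank-constrained updates, blended per input by softmax routing weights. All names (`MoLALinear`, `router`, the zero initialization of `B`) are illustrative assumptions, not the repository's actual implementation.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax along the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class MoLALinear:
    """Hypothetical sketch of a mixture-of-low-rank-adapters layer:
    y = x W^T + sum_i g_i(x) * x A_i^T B_i^T, with soft router gates g."""

    def __init__(self, d_in, d_out, ranks, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((d_out, d_in)) * 0.02   # frozen base weight
        # One low-rank pair (B_i, A_i) per expert; ranks may differ per expert,
        # mirroring per-layer ranks like `--lora_rank 4 4 4 8` in spirit.
        self.A = [rng.standard_normal((r, d_in)) * 0.02 for r in ranks]
        self.B = [np.zeros((d_out, r)) for r in ranks]       # zero-init, LoRA-style
        self.router = rng.standard_normal((len(ranks), d_in)) * 0.02

    def forward(self, x):
        # x: (batch, d_in) -> gates: (batch, n_experts)
        gates = softmax(x @ self.router.T)
        out = x @ self.W.T                                   # frozen base path
        for i, (A, B) in enumerate(zip(self.A, self.B)):
            out += gates[:, i:i+1] * (x @ A.T @ B.T)         # routed low-rank update
        return out
```

With zero-initialized `B`, the layer initially reproduces the frozen base output, so training starts from the pretrained behavior and the router gradually learns how to mix the adapters.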

## Citation

If you find MoLA useful for your research or development, please cite the following:

```bibtex
@article{zhou2024exploring,
  title={Exploring Training on Heterogeneous Data with Mixture of Low-rank Adapters},
  author={Zhou, Yuhang and Zhao, Zihua and Li, Haolin and Du, Siyuan and Yao, Jiangchao and Zhang, Ya and Wang, Yanfeng},
  journal={arXiv preprint arXiv:2406.09679},
  year={2024}
}
```

## Acknowledgement

This repository is built on LibMTL. We thank the authors for releasing their code.