This repository is the official PyTorch implementation of "Exploring Training on Heterogeneous Data with Mixture of Low-rank Adapters" (MoLA).
For example, you can run the following command for MoLA training:

```bash
python train.py --benchmark vlcs_resnet50 --balancer ourbase --arch lora_soft_router --lora_layer 1 2 3 --lora_rank 4 4 4 8
```
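The `--arch lora_soft_router` option suggests that each adapted layer combines several low-rank adapters through a learned soft router. Below is a minimal, self-contained sketch of that idea for a single linear layer; the class and parameter names (`LoRAExpert`, `MoLALayer`, `ranks`) are illustrative assumptions for exposition, not the repository's actual API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LoRAExpert(nn.Module):
    """One low-rank adapter: x -> B(A(x)), with A: d_in -> r and B: r -> d_out."""

    def __init__(self, d_in, d_out, rank):
        super().__init__()
        self.down = nn.Linear(d_in, rank, bias=False)  # A
        self.up = nn.Linear(rank, d_out, bias=False)   # B, zero-initialized so the
        nn.init.zeros_(self.up.weight)                 # adapter starts as a no-op

    def forward(self, x):
        return self.up(self.down(x))


class MoLALayer(nn.Module):
    """A frozen base linear layer plus a softly routed mixture of LoRA experts."""

    def __init__(self, base_linear, ranks):
        super().__init__()
        self.base = base_linear
        for p in self.base.parameters():
            p.requires_grad = False  # only the adapters and the router are trained
        self.experts = nn.ModuleList(
            LoRAExpert(base_linear.in_features, base_linear.out_features, r)
            for r in ranks
        )
        self.router = nn.Linear(base_linear.in_features, len(ranks))  # soft mixing weights

    def forward(self, x):
        weights = F.softmax(self.router(x), dim=-1)                   # (..., n_experts)
        delta = torch.stack([e(x) for e in self.experts], dim=-1)     # (..., d_out, n_experts)
        mixed = (delta * weights.unsqueeze(-2)).sum(dim=-1)           # weighted sum of adapters
        return self.base(x) + mixed


# Hypothetical usage, mirroring the ranks in the example command above:
layer = MoLALayer(nn.Linear(512, 512), ranks=[4, 4, 4, 8])
out = layer(torch.randn(2, 512))
```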
If you find MoLA useful for your research or development, please cite the following:
```bibtex
@article{zhou2024exploring,
  title={Exploring Training on Heterogeneous Data with Mixture of Low-rank Adapters},
  author={Zhou, Yuhang and Zhao, Zihua and Li, Haolin and Du, Siyuan and Yao, Jiangchao and Zhang, Ya and Wang, Yanfeng},
  journal={arXiv preprint arXiv:2406.09679},
  year={2024}
}
```
This repository is built on top of LibMTL. We thank the authors for releasing their code.