NAACL 2024: Mitigating Language-Level Performance Disparity in mPLMs via Teacher Language Selection and Cross-Lingual Self-Distillation

Welcome to the official implementation of the paper "Mitigating Language-Level Performance Disparity in mPLMs via Teacher Language Selection and Cross-Lingual Self-Distillation" presented at NAACL 2024.

Overview

This repository demonstrates our proposed approach for addressing language-level performance disparity in multilingual Pretrained Language Models (mPLMs). We introduce ALSACE, which comprises Teacher Language Selection and Cross-Lingual Self-Distillation, to mitigate this issue.
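As a rough illustration of the cross-lingual self-distillation idea (a minimal sketch, not the paper's exact implementation), the hypothetical snippet below aligns the model's output distribution for a student language with its distribution for a teacher language on parallel inputs via a KL-divergence consistency loss; the function name, arguments, and temperature handling are assumptions.

```python
import torch
import torch.nn.functional as F

def cross_lingual_distillation_loss(teacher_logits: torch.Tensor,
                                    student_logits: torch.Tensor,
                                    temperature: float = 1.0) -> torch.Tensor:
    """KL-divergence consistency loss between the model's predictions on
    parallel inputs in a teacher language and a student language.

    Hypothetical illustration only; the exact loss used in ALSACE may differ.
    """
    # Treat the teacher-language distribution as a fixed target (no gradient).
    teacher_probs = F.softmax(teacher_logits.detach() / temperature, dim=-1)
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    # "batchmean" matches the mathematical definition of KL divergence.
    return F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")
```

In the full method, Teacher Language Selection would first decide which languages act as teachers before such a consistency loss is applied.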

Features

With this code, you can:

  • Train an mPLM with the ALSACE method
  • Evaluate the language-level performance disparity of an mPLM

Requirements

This code requires Python 3.6+, PyTorch 1.0+, and the other common packages listed in requirements.txt, which can be installed with pip install -r requirements.txt.

Training

Instructions for training the model will be available soon.

Evaluation

Instructions and scripts for evaluating the language-level performance disparity of any mPLM will be available soon.
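Until the official scripts are released, the hypothetical sketch below shows one common way to summarize language-level performance disparity: compute a score (e.g., accuracy) per language and report the gap between the best- and worst-performing languages, along with the standard deviation. The statistics reported in the paper may differ.

```python
from typing import Dict, List

def language_disparity(per_language_scores: Dict[str, float]) -> Dict[str, float]:
    """Summarize language-level performance disparity from per-language scores.

    Hypothetical illustration; the official evaluation scripts may report
    different statistics.
    """
    scores: List[float] = list(per_language_scores.values())
    mean = sum(scores) / len(scores)
    return {
        "best": max(scores),               # strongest language
        "worst": min(scores),              # weakest language
        "gap": max(scores) - min(scores),  # best-to-worst gap
        "std": (sum((s - mean) ** 2 for s in scores) / len(scores)) ** 0.5,
    }

# Example with made-up per-language accuracies:
print(language_disparity({"en": 0.85, "de": 0.81, "sw": 0.66, "ur": 0.63}))
```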

Citation

If you use our work, please cite our paper (the BibTeX entry will be available soon).

Contacts

For any questions or suggestions, please submit an issue in this repository.

License

Apache License 2.0

Disclaimer

This is the official implementation, but it may not be free of bugs. Please use it responsibly.