# Paradigm Shift in NLP

Welcome to the webpage for "Paradigm Shift in Natural Language Processing". Resources accompanying the paper are maintained here, including a full list of papers on paradigm shift and an interactive Sankey diagram depicting the trend of paradigm shift.

## What is paradigm shift?

First of all, what is a paradigm, and what is a paradigm shift?

A paradigm is the general framework used to model a class of tasks. For example, sequence labeling (SeqLab) is a popular paradigm for solving named entity recognition (NER). We summarize the mainstream paradigms widely used for common NLP tasks as: Class, Matching, SeqLab, MRC, Seq2Seq, Seq2ASeq, and (M)LM.

A paradigm shift is the phenomenon of solving a task with a paradigm different from the one it is usually solved with. For example, Li et al. (2020) use the MRC paradigm to solve NER, which was previously solved with SeqLab; we then say that the paradigm of NER shifted from SeqLab to MRC.
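
For concreteness, here is a minimal, hypothetical sketch of how the same NER instance can be cast in the SeqLab paradigm (one BIO tag per token) versus the MRC paradigm (a natural-language question plus answer spans, in the spirit of Li et al. 2020). The field names and the question template are illustrative assumptions, not taken from any specific implementation:

```python
# The same NER example cast in two paradigms (illustrative only; the field
# names and question template below are hypothetical, not from the paper).

sentence = ["Barack", "Obama", "was", "born", "in", "Hawaii"]

# SeqLab paradigm: predict one BIO tag per token.
seqlab_instance = {
    "tokens": sentence,
    "labels": ["B-PER", "I-PER", "O", "O", "O", "B-LOC"],
}

# MRC paradigm (cf. Li et al. 2020): pose a question per entity type and
# extract answer spans (start, end) from the context.
mrc_instance = {
    "question": "Which persons are mentioned in the text?",
    "context": " ".join(sentence),
    "answer_spans": [(0, 1)],  # token span covering "Barack Obama"
}
```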

The figure below shows the observed shifts (or transfers) among the seven paradigms in recent years.

*(Figure: Paradigm shift in NLP tasks)*

We collect papers on paradigm shift in the table below, which is an extension of Table 1 in our original paper. The table will be updated continuously.

| Task | Class | Matching | SeqLab | MRC | Seq2Seq | Seq2ASeq | (M)LM |
| --- | --- | --- | --- | --- | --- | --- | --- |
| TC | Kim 2014; Liu et al. 2016; Devlin et al. 2019 | Chai et al. 2020; Yin et al. 2020; Wang et al. 2021 | | | Yang et al. 2018 | | Brown et al. 2020; Schick&Schutze 2021; Schick&Schutze 2021; Gao et al. 2021 |
| NLI | Devlin et al. 2019 | Chen et al. 2017 | | McCann et al. 2018 | | | Schick&Schutze 2021; Schick&Schutze 2021; Gao et al. 2021 |
| NER | Xia et al. 2019; Fisher&Vlachos 2019; Yu et al. 2020; Fu et al. 2021 | | Ma&Hovy 2016; Lample 2016 | Li et al. 2020 | Yan et al. 2021 | Lample et al. 2016; Dai et al. 2020 | Ma et al. 2021 |
| ABSA | Wang et al. 2016 | Sun et al. 2019 | | Mao et al. 2021; Chen et al. 2021 | Yan et al. 2021; Zhang et al. 2021 | | Li et al. 2021 |
| RE | Zeng et al. 2014 | | | Levy et al. 2017; Li et al. 2019; Zhao et al. 2020 | Zeng et al. 2018 | | Han et al. 2021 |
| Summ | | Zhong et al. 2020 | Cheng&Lapata 2016 | | Nallapati et al. 2016 | | Aghajanyan et al. 2021 |
| Parsing | | | Rodríguez&Vilares 2018; Strzyz et al. 2019; Vilares&Rodríguez 2020; Vacareanu et al. 2020 | Gan et al. 2021 | Vinyals et al. 2015; Li et al. 2018; Rongali et al. 2020 | Chen et al. 2014; Dyer et al. 2015 | Choe&Charniak 2016 |

## Trends

To depict the trend of paradigm shift in NLP intuitively, we also provide an interactive Sankey diagram, which is an extension of Figure 2 in our original paper. The diagram is updated continuously as the table above changes.
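
If you want to render a similar diagram yourself, the sketch below shows one way to draw paradigm-shift flows as a Sankey diagram. It assumes the plotly library; the flow counts are placeholders for illustration only, not the actual statistics behind our diagram:

```python
# Hypothetical sketch: render paradigm-shift flows as a Sankey diagram with plotly.
# The flows below are placeholder examples, not the actual data of the paper.
import plotly.graph_objects as go

paradigms = ["Class", "Matching", "SeqLab", "MRC", "Seq2Seq", "Seq2ASeq", "(M)LM"]

# Each flow: (source paradigm, target paradigm, number of observed shifts).
flows = [
    ("SeqLab", "MRC", 1),      # e.g. NER solved as MRC (Li et al. 2020)
    ("SeqLab", "Seq2Seq", 1),  # e.g. NER solved as Seq2Seq (Yan et al. 2021)
    ("Class", "(M)LM", 1),     # e.g. TC solved with prompting (Gao et al. 2021)
]

index = {name: i for i, name in enumerate(paradigms)}
fig = go.Figure(go.Sankey(
    node=dict(label=paradigms),
    link=dict(
        source=[index[s] for s, _, _ in flows],
        target=[index[t] for _, t, _ in flows],
        value=[v for _, _, v in flows],
    ),
))
fig.show()
```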

## Contributing

This line of research is difficult to survey comprehensively, so any additions, modifications, and suggestions are welcome! Please feel free to submit a pull request or contact me directly.

## Citation

If you find this webpage or the paper helpful to your research, please cite our paper:

@article{Sun2022Paradigm,
  author    = {Tianxiang Sun and Xiangyang Liu and Xipeng Qiu and Xuanjing Huang},
  title     = {Paradigm Shift in Natural Language Processing},
  journal   = {Machine Intelligence Research},
  year      = {2022},
  volume    = {19},
  pages     = {169--183},
  url       = {https://doi.org/10.1007/s11633-022-1331-6},
}