Showing 1 changed file with 2 additions and 2 deletions.
@@ -1,7 +1,7 @@
 # Reversal_Curse
 This page contains three Python notebooks, [Reversal_data_generation](https://github.com/WGLab/Reversal_Curse/blob/main/Reversal_data_generation.ipynb), [llama_union_intersection](https://github.com/WGLab/Reversal_Curse/blob/main/llama_union_intersection.ipynb) and [bert_reversal_curse](https://github.com/WGLab/Reversal_Curse/blob/main/bert_reversal_curse.ipynb), for the [paper](https://arxiv.org/abs/2312.03633) *Yang, Jingye, Da Wu, and Kai Wang. "Not All Large Language Models (LLMs) Succumb to the 'Reversal Curse': A Comparative Study of Deductive Logical Reasoning in BERT and GPT Models." arXiv preprint arXiv:2312.03633 (2023).*
 
-Specifically, the [Reversal_data_generation](https://github.com/WGLab/Reversal_Curse/blob/main/Reversal_data_generation.ipynb) contains all the code for generating synthetic training and testing data in the paper. The [BERT_reversal](https://github.com/WGLab/Reversal_Curse/blob/main/BERT_reversal.ipynb) contains the code for all the training and evulation on BERT model, whereas [llama_union_intersectoin](https://github.com/WGLab/Reversal_Curse/blob/main/llama_union_intersection.ipynb) contains the code for all the training and evulation on LlaMA model.
+Specifically, [Reversal_data_generation](https://github.com/WGLab/Reversal_Curse/blob/main/Reversal_data_generation.ipynb) contains all the code for generating the synthetic training and testing data used in the paper. [bert_reversal_curse](https://github.com/WGLab/Reversal_Curse/blob/main/bert_reversal_curse.ipynb) contains the code for training and evaluating the BERT model, and [llama_union_intersection](https://github.com/WGLab/Reversal_Curse/blob/main/llama_union_intersection.ipynb) contains the code for training and evaluating the LLaMA model. These two notebooks reproduce the main results of the paper (Tables 1-5).
 
-Furthermore, for those inclined, these code, especially the code for generating synthetic data, can be readily modified and employed for personalized testing and exploration. This flexibility allows interested individuals to tailor the code to their specific requirements and objectives.
+Furthermore, for those inclined, this code, especially the code for generating synthetic data, can be readily modified and employed for personalized testing and exploration. This flexibility allows interested individuals to tailor the code to their specific requirements and objectives.
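
For readers who want to adapt the data generation step described above, the core idea behind reversal-curse experiments is to build facts whose forward form ("A is B") appears in training while the reversed form ("B is A") is held out for testing. The sketch below is a hypothetical illustration of that setup, not the actual scheme in Reversal_data_generation.ipynb; the names, roles, and the `make_pairs` helper are invented for this example.

```python
# Hypothetical sketch of reversal-style synthetic data, assuming the
# "A is B" (train) vs. "B is A" (test) setup studied in the paper.
# Names, roles, and make_pairs are invented for illustration only.
import random

NAMES = ["Alice Nguyen", "Bob Okafor", "Carol Silva", "David Kowalski"]
ROLES = ["astronomer", "beekeeper", "cartographer", "glassblower"]

def make_pairs(names, roles, seed=0):
    """Pair each name with a unique role so every reversed fact is unambiguous."""
    rng = random.Random(seed)
    names, roles = names[:], roles[:]   # copy so the inputs are left untouched
    rng.shuffle(names)
    rng.shuffle(roles)
    forward = [f"{n} is the {r}." for n, r in zip(names, roles)]   # training facts
    reverse = [f"The {r} is {n}." for n, r in zip(names, roles)]   # held-out probes
    return forward, reverse

if __name__ == "__main__":
    train_facts, test_probes = make_pairs(NAMES, ROLES)
    for fact, probe in zip(train_facts, test_probes):
        print(fact, "->", probe)
```

A model is then trained only on the forward facts and queried with the reversed probes; a model that succumbs to the reversal curse fails the reversed direction, which is roughly what the two training/evaluation notebooks measure on BERT and LLaMA respectively.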