Interactions with billion-scale large language models typically yield long-form responses due to their extensive parametric capacities, along with retrieval-augmented features. While detailed responses provide insightful viewpoints on a specific subject, they frequently contain redundant and less engaging content that does not meet user interests. In this work, we focus on the role of query outlining (i.e., a selected sequence of queries) in scenarios where users request a specific range of information, namely coverage-conditioned ($C^2$) scenarios. To simulate $C^2$ scenarios, we construct QTree, 10K sets of information-seeking queries decomposed from various perspectives on certain topics. Using QTree, we train QPlanner, a 7B language model that generates customized query outlines following coverage-conditioned queries. We analyze the effectiveness of the generated outlines through automatic and human evaluation, targeting retrieval-augmented generation (RAG). Moreover, the experimental results demonstrate that QPlanner with alignment training can further provide outlines satisfying diverse user interests.
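To make the coverage-conditioned setup concrete, below is a minimal inference sketch that prompts a QPlanner-style causal LM with a base query and a coverage query via 🤗 Transformers. The checkpoint name, example queries, and prompt template are placeholders for illustration only, not the official ones; adjust them to the released model and its training format.

```python
# Minimal sketch: prompting a QPlanner-style causal LM with a coverage-conditioned query.
# NOTE: "your-org/qplanner-7b", the example queries, and the prompt template below are
# placeholders, not the official checkpoint or format.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-org/qplanner-7b"  # placeholder checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

q_base = "What are the health effects of caffeine?"   # base query (q_base)
q_cov = "Exclude anything related to pregnancy."      # coverage query (q_cov), intent: exclude

# Illustrative prompt; the real training format may differ.
prompt = (
    f"Base query: {q_base}\n"
    f"Coverage condition: {q_cov}\n"
    "Generate a query outline (four subqueries) that satisfies the condition:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
# Decode only the newly generated tokens (the query outline).
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```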
- # of samples: 10,580 [LINK]
- Note: There are three more samples than those specified in the paper.
- Configuration
  - `question`: Base query ($q_{base}$)
  - `instruction`: Coverage query ($q_{cov}$)
  - `background`: Background query
  - `intention`: Intent operation (include/exclude)
  - `tree`: QTree (a hierarchical set of queries)
  - `candidates`: Three candidate query outlines (i.e., four subqueries from QTree) extracted by an LLM
- # of samples: 300 [LINK]
- Configuration
  - `question`: Base query ($q_{base}$)
  - `instruction`: Coverage query ($q_{cov}$)
  - `background`: Background query
  - `intention`: Intent operation (include/exclude)
  - `tree`: QTree (a hierarchical set of queries)
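Since each record follows the field layout above, a small script is enough to load and inspect it. The sketch below assumes the data is distributed as a JSON Lines file with those field names; `qtree_train.jsonl` is a placeholder path, so point it at whichever file the [LINK] above provides.

```python
# Minimal sketch for inspecting QTree records.
# Assumes a JSON Lines file with the fields listed above; "qtree_train.jsonl" is a placeholder path.
import json

with open("qtree_train.jsonl", encoding="utf-8") as f:
    records = [json.loads(line) for line in f]

sample = records[0]
print("Base query      :", sample["question"])     # q_base
print("Coverage query  :", sample["instruction"])  # q_cov
print("Background query:", sample["background"])
print("Intent operation:", sample["intention"])    # include / exclude
print("QTree           :", sample["tree"])         # hierarchical set of queries
# Only the 10,580-sample split carries candidate outlines.
print("Candidates      :", sample.get("candidates"))
```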
- Our QTree is based on seed queries from ASQA, Longform, and ExpertQA.
- We appreciate the 🤗 alignment-handbook for providing an easy-to-use LM training framework!
```bibtex
@misc{kim2024learning,
      title={Learning to Explore and Select for Coverage-Conditioned Retrieval-Augmented Generation},
      author={Takyoung Kim and Kyungjae Lee and Young Rok Jang and Ji Yong Cho and Gangwoo Kim and Minseok Cho and Moontae Lee},
      year={2024},
      eprint={2407.01158},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2407.01158},
}
```