Thanks for the great tool.
I was wondering whether there is a way to adjust the number of summaries paperQA outputs. The output always seems to be five blocks of text with references, regardless of how I change the following parameters: `--answer.answer_max_sources 20 --answer.answer_length 1500 --answer.evidence_summary_length 500 --answer.evidence_k 25`

In addition, answers are usually about one page long regardless of how much relevant data is found. I have tried to lengthen them by adjusting the parameters above, but with no success. Does paperQA impose some token limit on the GPT-4 API output, and can it be adjusted? I found references to token limits when using local LLMs, but I could not figure out how to adjust this for the OpenAI API.

If these can be adjusted, would you kindly provide examples for command-line use? I currently run: `!pqa --llm="gpt-4o-mini" --summary_llm="gpt-4o-mini"`
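For reference, this is the full invocation I have been testing, combining the flags listed above (run from a Jupyter cell, hence the leading `!`; the flag values are just the ones I tried, and the final `ask "<question>"` is simply how I call the CLI):

```shell
# Full command I have been experimenting with; values are examples only
!pqa \
  --llm="gpt-4o-mini" \
  --summary_llm="gpt-4o-mini" \
  --answer.answer_max_sources 20 \
  --answer.answer_length 1500 \
  --answer.evidence_summary_length 500 \
  --answer.evidence_k 25 \
  ask "my question here"
```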