How to make it remember the past conversation? #137
-
Hey, thanks for making this. It makes the process of building LLM-based applications so much easier for an absolute beginner like me. I've been playing with it for a while and noticed it doesn't have the capability to remember past conversation history; on top of that, its responses are short. How do I make it remember the past conversation history, and make the responses longer and more useful? I noticed in the config.yaml file that you can specify more than two tasks; can we combine them so that when a request is made it can execute more than two tasks?
Replies: 1 comment
-
Hello @capsaicinoids,

We're thrilled to hear that our `autollm` package has been helpful to you as you explore building LLM-based applications! Your feedback is invaluable to us, especially coming from users who are just beginning their journey into this exciting field.

Regarding your queries:

Conversation Memory:
At the moment, `autollm` is designed to handle stateless interactions, which means it doesn't inherently remember past interactions or the history of the conversation. This stateless design is intentional to simplify the model's operations for single-query scenarios, which are common in many applications. If you require a conversational memory feature, this would indeed be a subst…

For longer responses, please refer to our updated `config.yaml`:

```yaml
# config.yaml
llm_params:
  max_tokens: 1024  # Increase this value for longer outputs
```
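Since the package itself is stateless, one common workaround is to keep conversation history on the client side and prepend it to each query. Below is a minimal sketch of that idea; `ask_llm` is a hypothetical stand-in for whatever single-query function your stack exposes, and is not part of the actual autollm API:

```python
from typing import Callable, Dict, List


def make_conversational(ask_llm: Callable[[str], str]) -> Callable[[str], str]:
    """Wrap a stateless query function so each call sees prior turns.

    `ask_llm` is a hypothetical stateless function: prompt string in,
    answer string out. The wrapper accumulates turns in a local list
    and renders them into the prompt on every call.
    """
    history: List[Dict[str, str]] = []

    def ask(user_message: str) -> str:
        # Render the accumulated turns into a single transcript string.
        transcript = "\n".join(
            f"{turn['role']}: {turn['content']}" for turn in history
        )
        prompt = (
            f"{transcript}\nuser: {user_message}"
            if transcript
            else f"user: {user_message}"
        )
        answer = ask_llm(prompt)
        # Record both sides of the exchange for the next call.
        history.append({"role": "user", "content": user_message})
        history.append({"role": "assistant", "content": answer})
        return answer

    return ask
```

Note that because the full transcript is resent on every call, long conversations will eventually hit the model's context limit; real implementations truncate or summarize older turns.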