
Error on maximum context for GPT request and ask for another goal :/ #525

Closed
anegoda1995 opened this issue Dec 3, 2023 · 0 comments · Fixed by #640

anegoda1995 commented Dec 3, 2023

Describe the bug
When I post a link to a GitHub issue and ask the agent to analyze it, the agent returns an error.

To Reproduce
Steps to reproduce the behavior:

  1. Ask the agent "Look at this issue and write me what I can do to fix it https://github.com/RetroAchievements/RAWeb/issues/1863"
  2. Wait for steps 3–5 to throw an error: 400 This model's maximum context length is 16385 tokens. However, your messages resulted in 24027 tokens. Please reduce the length of the messages.

Expected behavior
The agent should split the information and send it across two or more requests to achieve the goal.
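A minimal sketch of the behavior requested above: break an oversized payload into chunks that each fit under the model's context budget, so each chunk can be sent as a separate request. This is illustrative only, not code from the project; token counting is approximated with whitespace-separated words (a real client would use the model's tokenizer), and `RESERVED` is an assumed headroom value for the prompt and response.

```python
MAX_TOKENS = 16385   # model's context limit, from the error message
RESERVED = 2000      # assumed headroom for prompt/response tokens


def approx_tokens(text: str) -> int:
    # Rough proxy: one "token" per whitespace-separated word.
    return len(text.split())


def split_into_chunks(text: str, budget: int = MAX_TOKENS - RESERVED) -> list[str]:
    # Greedily pack words into chunks that stay under the budget.
    chunks, current, count = [], [], 0
    for word in text.split():
        if count + 1 > budget:
            chunks.append(" ".join(current))
            current, count = [], 0
        current.append(word)
        count += 1
    if current:
        chunks.append(" ".join(current))
    return chunks


# Example: a payload of 24027 "tokens" (as in the error) splits into
# two requests, each under the budget.
payload = ("word " * 24027).strip()
parts = split_into_chunks(payload)
print(len(parts), all(approx_tokens(p) <= MAX_TOKENS - RESERVED for p in parts))
```

Each chunk could then be sent as its own API call, with the agent aggregating the partial results before continuing toward the goal.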
