Thanks for the great idea.
I tried calling the OpenAI API and passing the chunks into the conversation context.
Because of the token limit, I got this error:
"type": "InvalidRequestError",
"message": "This model's maximum context length is 4097 tokens. However, your messages resulted in 4415 tokens. Please reduce the length of the messages."
Any suggestions?
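One common workaround is to trim the oldest chunks before each request so the running message list stays under the 4097-token limit. Here is a minimal sketch; the ~4-characters-per-token estimate and the helper names are assumptions (a real tokenizer such as tiktoken would give exact counts):

```python
# Hypothetical sketch: keep a chat message list under the model's context
# limit by dropping the oldest chunk messages first. Token counts are
# approximated as ~4 characters per token; swap in a real tokenizer
# (e.g. tiktoken) for exact counts.

MAX_CONTEXT_TOKENS = 4097
RESPONSE_BUDGET = 512  # tokens reserved for the model's reply


def estimate_tokens(text: str) -> int:
    """Rough heuristic: about one token per 4 characters of English text."""
    return max(1, len(text) // 4)


def trim_messages(messages, limit=MAX_CONTEXT_TOKENS - RESPONSE_BUDGET):
    """Drop the oldest non-system messages until the estimate fits the limit."""
    kept = list(messages)

    def total(msgs):
        return sum(estimate_tokens(m["content"]) for m in msgs)

    while len(kept) > 1 and total(kept) > limit:
        # Preserve the first message if it is the system prompt,
        # and drop the oldest chunk after it.
        idx = 1 if kept[0]["role"] == "system" else 0
        kept.pop(idx)
    return kept
```

An alternative to dropping chunks is to summarize the older ones with a separate API call and keep only the summary in context, which preserves more information at the cost of extra requests.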