How to check the prompt sent to the LLM when we use the ChatCompletion.create() method? #2989
Unanswered
AntoninLeroy asked this question in Q&A
Replies: 1 comment
I use Mixtral 8x7B and I want to make sure that the prompt template applied to my messages is correct.
Is it possible to see the exact prompt string that is sent to the model?
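One way to sanity-check the template, as a minimal sketch rather than a view into what ChatCompletion.create() does internally: apply the chat template shipped with the model's tokenizer locally via Hugging Face transformers and compare the resulting string with whatever your backend logs. The model id and messages below are assumptions; substitute the exact checkpoint your server loads.

```python
# Sketch: reconstruct the prompt locally from the tokenizer's chat template.
# This assumes the serving backend uses the same template as the checkpoint;
# if it applies its own template server-side, this string is only a reference.
from transformers import AutoTokenizer

# Assumed checkpoint; replace with the one you actually serve.
tokenizer = AutoTokenizer.from_pretrained("mistralai/Mixtral-8x7B-Instruct-v0.1")

messages = [
    {"role": "user", "content": "Hello, who are you?"},
]

# tokenize=False returns the raw prompt string instead of token ids;
# add_generation_prompt=True adds the cue for the assistant's turn when
# the template defines one.
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
)
print(prompt)
# For Mixtral-Instruct this should look roughly like:
# <s>[INST] Hello, who are you? [/INST]
```

Many OpenAI-compatible servers build the prompt server-side, so the string printed above is a reference to compare against the server's own verbose/debug output rather than proof of what was actually sent.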