This repository has been archived by the owner on Jan 5, 2025. It is now read-only.

Commit

minor enhancements
codebanesr committed Dec 15, 2023
1 parent 85a8871 commit 6b13f79
Showing 3 changed files with 3 additions and 6 deletions.
5 changes: 1 addition & 4 deletions llm-server/routes/workflow/extractors/extract_body.py
```diff
@@ -41,10 +41,7 @@ async def gen_body_from_schema(
         HumanMessage(content="prev api responses: {}".format(prev_api_response)),
         HumanMessage(content="current_state: {}".format(current_state)),
-        HumanMessage(
-            content="If the user is asking to generate values for some fields, likes product descriptions, jokes etc add them."
-        ),
         HumanMessage(
-            content="Given the provided information, generate the appropriate minified JSON payload to use as body for the API request. If a user doesn't provide a required parameter, use sensible defaults for required params, and leave optional params."
+            content="Generate the compact JSON payload for the API request based on the provided information, without adding commentary. If a user fails to provide a necessary parameter, default values for required parameters will be used, while optional parameters will be left unchanged."
         ),
     ]
```
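The revised prompt above asks the model for a compact ("minified") JSON body with no commentary. A minimal sketch of what post-processing that reply could look like, assuming the model returns valid JSON (the helper name is hypothetical, not code from this repository):

```python
import json


def minify_llm_payload(raw: str) -> str:
    """Parse the model's reply and re-serialize it as compact JSON.

    Raises ValueError (via json.loads) if the reply is not valid JSON,
    which is exactly the failure mode the "without adding commentary"
    instruction in the prompt is trying to prevent.
    """
    payload = json.loads(raw)
    # separators=(",", ":") strips the spaces json.dumps inserts by
    # default, producing the compact form the prompt asks for.
    return json.dumps(payload, separators=(",", ":"))


print(minify_llm_payload('{ "name": "Widget", "price": 9.99 }'))
# → {"name":"Widget","price":9.99}
```

Re-minifying on the application side keeps the request body deterministic even when the model varies its whitespace.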

2 changes: 1 addition & 1 deletion llm-server/routes/workflow/extractors/extract_param.py
```diff
@@ -30,7 +30,7 @@ async def gen_params_from_schema(
         HumanMessage(
             content="Based on the information provided, construct a valid parameter object to be used with python requests library. In cases where user input doesnot contain information for a query, DO NOT add that specific query parameter to the output. If a user doesn't provide a required parameter, use sensible defaults for required params, and leave optional params."
         ),
-        HumanMessage(content="Your output must be a valid json, without any commentry"),
+        HumanMessage(content="Your output must be a valid json, without any commentary"),
     ]
     result = chat(messages)
     logger.info("[OpenCopilot] LLM Body Response: {}".format(result.content))
```
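The prompt in this hunk tells the model to omit any query parameter the user never supplied. The same invariant can be enforced on the application side; a minimal sketch under that assumption (the helper is hypothetical, not taken from the repository):

```python
def build_query_params(user_values: dict) -> dict:
    """Drop query parameters the user never supplied, mirroring the
    prompt's instruction not to emit keys without user input.

    The returned dict is in the shape the `params=` argument of the
    python requests library expects.
    """
    return {k: v for k, v in user_values.items() if v is not None}


params = build_query_params({"q": "shoes", "page": None, "limit": 10})
print(params)
# → {'q': 'shoes', 'limit': 10}
```

Filtering locally means a stray `null` from the model never reaches the upstream API as a literal query value.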
```diff
@@ -87,7 +87,7 @@ def process_conversation_step(
         "bot_message": "your response based on the instructions provided at the beginning, this could also be clarification if the information provided by the user is not complete / accurate",
     }
-    Don't add operation ids if you can reply by merely looking in the conversation history.
+    Don't add operation ids if you can reply by merely looking in the conversation history, also don't add any commentary.
     """
     )
 )
```
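This hunk expects the model to answer with a JSON object containing `ids` and `bot_message`, and the commit tightens the prompt so no commentary surrounds it. A minimal, defensive sketch of parsing such a reply, assuming stray commentary may still occasionally slip in (the helper is hypothetical, not code from this repository):

```python
import json


def parse_step_response(raw: str) -> dict:
    """Best-effort parse of the conversation-step reply described above.

    Tries strict JSON first; if commentary leaked in despite the prompt,
    falls back to the outermost {...} span of the reply.
    """
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        start, end = raw.find("{"), raw.rfind("}")
        if start == -1 or end <= start:
            raise ValueError("no JSON object found in model reply")
        return json.loads(raw[start : end + 1])


reply = 'Sure! {"ids": [], "bot_message": "Could you share the order id?"}'
print(parse_step_response(reply)["bot_message"])
# → Could you share the order id?
```

Prompt tightening and a tolerant parser are complementary: the first makes failures rare, the second keeps the rare failure from crashing the workflow.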
