fix: correct model type extraction for O1 model handling in litellm_ai_handler.py
mrT23 committed Oct 19, 2024
1 parent e82afdd commit b743714
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion pr_agent/algo/ai_handlers/litellm_ai_handler.py
@@ -188,7 +188,8 @@ async def chat_completion(self, model: str, system: str, user: str, temperature:

         # Currently O1 does not support separate system and user prompts
         O1_MODEL_PREFIX = 'o1-'
-        if model.startswith(O1_MODEL_PREFIX):
+        model_type = model.split('/')[-1]  # 'azure/o1-' or 'o1-'
+        if model_type.startswith(O1_MODEL_PREFIX):
             user = f"{system}\n\n\n{user}"
             system = ""
             get_logger().info(f"Using O1 model, combining system and user prompts")
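For context, the change makes the O1 check robust to provider-prefixed model names such as 'azure/o1-preview', which previously failed the startswith check. Below is a minimal standalone sketch of the resulting logic; the helper name combine_prompts_for_o1 and the example model strings are illustrative only and do not appear in the repository, where this logic lives inline in chat_completion.

# Sketch of the O1 prompt-combining behavior after this commit.
# combine_prompts_for_o1 is a hypothetical helper, not the repository's API.

O1_MODEL_PREFIX = 'o1-'

def combine_prompts_for_o1(model: str, system: str, user: str) -> tuple[str, str]:
    """Return (system, user) adjusted for O1 models, which do not
    support a separate system prompt."""
    # Strip any provider prefix, e.g. 'azure/o1-preview' -> 'o1-preview'.
    model_type = model.split('/')[-1]
    if model_type.startswith(O1_MODEL_PREFIX):
        user = f"{system}\n\n\n{user}"
        system = ""
    return system, user

# Example: both plain and Azure-hosted O1 names are now detected.
print(combine_prompts_for_o1("o1-mini", "sys prompt", "user prompt"))
print(combine_prompts_for_o1("azure/o1-preview", "sys prompt", "user prompt"))
print(combine_prompts_for_o1("gpt-4o", "sys prompt", "user prompt"))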
