
fix chat template bug
Blaizzy committed Aug 16, 2024
1 parent fe842f3 commit bef76d8
Showing 1 changed file with 1 addition and 3 deletions.
mlx_vlm/prompt_utils.py (1 addition & 3 deletions)

@@ -33,9 +33,7 @@ def get_message_json(model_name, prompt):
 def apply_chat_template(processor, config, prompt):
     message = get_message_json(config["model_type"], prompt)
 
-    if "chat_template" in processor.__dict__.keys() and hasattr(
-        processor, "default_chat_template"
-    ):
+    if "chat_template" in processor.__dict__.keys():
         return processor.apply_chat_template(
             [message],
             tokenize=False,
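
Why the extra check was a bug: a processor can carry chat_template as an instance attribute without defining default_chat_template at all, so the old compound condition skipped the template path for such processors. Below is a minimal, self-contained sketch of the two conditions; DummyProcessor is hypothetical, for illustration only (the real object is typically a Hugging Face processor).

# Sketch, assuming a processor whose chat_template is set as an instance
# attribute (so it shows up in __dict__) but which has no
# default_chat_template attribute. DummyProcessor is hypothetical.

class DummyProcessor:
    def __init__(self):
        # Instance attribute, so it appears in self.__dict__ ...
        self.chat_template = "{% for m in messages %}{{ m['content'] }}{% endfor %}"
        # ... but no `default_chat_template` is ever defined.

    def apply_chat_template(self, messages, tokenize=False, **kwargs):
        # Stand-in for real template rendering.
        return " ".join(m["content"] for m in messages)

p = DummyProcessor()

# Old condition: False, because hasattr(p, "default_chat_template") fails,
# so the template was never applied even though one exists.
old = "chat_template" in p.__dict__.keys() and hasattr(p, "default_chat_template")

# New condition: True, the template path is taken as intended.
new = "chat_template" in p.__dict__.keys()

print(old, new)  # False True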
