fix(models): handle string content when merging consecutive same-role messages#2083

Open
NIK-TIGER-BILL wants to merge 1 commit into huggingface:main from NIK-TIGER-BILL:fix/litellm-consecutive-system-messages
Conversation

@NIK-TIGER-BILL
Problem

LiteLLMModel (and any other ApiModel subclass) fails with an AssertionError when the input message list contains multiple consecutive messages with the same role (e.g., two system messages in a row):

messages = [
    {"role": "system", "content": "When you say anything Start with 'FOO'"},
    {"role": "system", "content": "When you say anything End with 'BAR'"},
    {"role": "user", "content": "Just say '.'"},
]
response = model(messages)  # AssertionError: Error: wrong content: ...

The prepare_completion_kwargs() helper in models.py merges consecutive same-role messages, but it asserts that message.content is a list before merging. When messages are provided with plain string content (the common dict format shown above), that assertion fails.

Fix

  • Normalize string content to [{"type": "text", "text": ...}] before merging
  • Handle the case where the previous output message also has plain string content
  • Fix flatten_messages_as_text path to handle string content gracefully
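The normalization and merge described above can be sketched as follows. This is an illustrative standalone version, not the actual smolagents code; the helper names (normalize_content, merge_consecutive_messages) are hypothetical and the real logic lives inside prepare_completion_kwargs() in models.py:

```python
def normalize_content(content):
    # Normalize plain string content into the structured list format
    # [{"type": "text", "text": ...}]; structured content passes through.
    if isinstance(content, str):
        return [{"type": "text", "text": content}]
    return content


def merge_consecutive_messages(messages):
    # Merge consecutive same-role messages, normalizing string content
    # first so the merge never operates on a raw string. This also covers
    # the case where the previously merged message had string content.
    merged = []
    for message in messages:
        content = normalize_content(message["content"])
        if merged and merged[-1]["role"] == message["role"]:
            merged[-1]["content"] = normalize_content(merged[-1]["content"]) + content
        else:
            merged.append({"role": message["role"], "content": content})
    return merged
```

With the repro messages from above, the two system messages collapse into one system message whose content is a two-element text list, followed by the user message.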

Fixes #1972

… messages

When consecutive messages have the same role and plain string content,
prepare_completion_kwargs() fails with an AssertionError because it
asserts content is a list before merging.

Fix by normalizing string content to [{"type": "text", "text": ...}]
format before attempting to merge, and handling the case where the
previous message in output_message_list also has plain string content.

This allows LiteLLMModel (and other ApiModel subclasses) to accept
multiple consecutive system messages, a common pattern for layered
or compositional system instructions.

Fixes huggingface#1972

Development

Successfully merging this pull request may close these issues.

BUG: LiteLLMModel crashes when multiple consecutive system messages are provided