Fix chat_template crash when assistant message omits the `content` key
## The problem
When an assistant message has `tool_calls` but no `content` key, the template crashes:

```
UndefinedError: 'dict object' has no attribute 'content'
```
Today this is rare because most callers pass content=None explicitly, which the template handles. But the upcoming transformers change https://github.com/huggingface/transformers/pull/45422 will strip content=None before rendering (treating None and absent as equivalent — which matches what the OpenAI API returns for tool-call-only messages). Once it ships, every tool-calling request through this template will fail.
## Repro
Today (works):
```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("qgallouedec/DeepSeek-R1")
tok.apply_chat_template(
    [
        {"role": "user", "content": "What's the weather in Paris?"},
        {"role": "assistant", "content": None, "tool_calls": [{
            "type": "function",
            "function": {"name": "get_weather", "arguments": '{"city":"Paris"}'},
        }]},
    ],
    tokenize=False,
)
# renders correctly
```
After https://github.com/huggingface/transformers/pull/45422, the same call with the same input fails: transformers strips `content=None` before rendering, so the template sees an absent key and crashes:

```
UndefinedError: 'dict object' has no attribute 'content'
```
You can reproduce the post-PR behavior today by simply omitting the content key.
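The failure can also be reproduced at the Jinja level without downloading a model. The snippet below is a sketch: the one-line guard is a hypothetical stand-in for the template's content check, not the actual DeepSeek template, and it assumes `StrictUndefined`, which matches the `UndefinedError` behavior seen above.

```python
from jinja2 import Environment, StrictUndefined
from jinja2.exceptions import UndefinedError

# Strict undefined handling: touching a missing key raises instead of
# silently rendering nothing (assumption: mirrors how transformers
# compiles chat templates, given the UndefinedError above).
env = Environment(undefined=StrictUndefined)

# Toy guard standing in for the template's content check.
guard = env.from_string(
    "{% if message['content'] is not none %}{{ message['content'] }}{% endif %}"
)

no_content_key = {"role": "assistant", "tool_calls": []}
try:
    guard.render(message=no_content_key)
    error = None
except UndefinedError as e:
    error = str(e)
print(error)  # e.g. 'dict object' has no attribute 'content'
```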
## The fix
A minimal change to the template expression: `message['content'] is none` becomes `message.get('content') is none`. `.get()` returns None whether the key is absent or set to None, so both cases are handled identically.
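The fixed lookup can be checked the same way. Again a sketch with a toy guard (an illustrative fragment, not the actual template): with `.get()`, an explicit `content=None` and an absent `content` key render identically.

```python
from jinja2 import Environment, StrictUndefined

env = Environment(undefined=StrictUndefined)

# Toy guard using the fixed .get() lookup (illustrative only).
fixed = env.from_string(
    "{% if message.get('content') is not none %}{{ message.get('content') }}{% endif %}"
)

# Explicit None and an absent key now take the same branch.
out_none = fixed.render(message={"role": "assistant", "content": None})
out_absent = fixed.render(message={"role": "assistant"})
print(out_none == out_absent == "")  # True
```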
Verified against 14 message-shape variants: 13 render bit-identically to the current template, and the remaining one (the crashing case above) now renders correctly. No regressions.