In #5425, I mentioned that the chat template can ideally be detected from the model metadata key `tokenizer.chat_template`, but at that time I didn't know it was possible to access the metadata.

Now that we have `llama_chat_apply_template`, we no longer have to worry about the metadata. We can use this new function to format the chat supplied to `/v1/chat/completions`.
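For reference, here is a minimal sketch of how the server could build the prompt from an OpenAI-style `messages` array, assuming the `llama_chat_apply_template` signature introduced in #5538 (passing `nullptr` as the template so llama.cpp falls back to the one stored in the model metadata; the resize-and-retry loop follows the convention that the function returns the required output length):

```cpp
// Sketch only: format OpenAI-style chat messages using the model's built-in
// chat template (read from tokenizer.chat_template by llama.cpp).
#include <string>
#include <vector>
#include "llama.h"

static std::string format_chat(const llama_model * model,
                               const std::vector<llama_chat_message> & messages) {
    std::vector<char> buf(messages.size() * 1024);
    // tmpl == nullptr -> use the template embedded in the model metadata
    int32_t res = llama_chat_apply_template(model, nullptr,
                                            messages.data(), messages.size(),
                                            /*add_ass=*/true,
                                            buf.data(), (int32_t) buf.size());
    if (res < 0) {
        return ""; // template not supported by the built-in parser
    }
    if ((size_t) res > buf.size()) {
        // buffer was too small; res is the required size, so retry once
        buf.resize(res);
        res = llama_chat_apply_template(model, nullptr,
                                        messages.data(), messages.size(),
                                        true, buf.data(), (int32_t) buf.size());
    }
    return std::string(buf.data(), res);
}
```

The server would then tokenize the returned string as a regular prompt before running inference, exactly as it does for `/completion`.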
Depends on #5538 being merged.