fix(core): on_llm_new_token correct casting for .content to string #33498
        
              Conversation
CodSpeed Performance Report: Merging #33498 will not alter performance.
```python
run_manager.on_llm_new_token(
    json.dumps(chunk.message.content), chunk=chunk
)
```
do we want to json.dumps the content or just call chunk.message.text as the original issue proposed?
my bias is to do chunk.message.text, which will represent tokens of text content. it will miss tokens from reasoning, tool calls, and other content, but I don't think on_llm_new_token was designed to support these well. so my thought is to go with well-defined behavior for now.
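For illustration, a minimal, self-contained sketch of the trade-off being discussed. The content-block shape here is assumed to mirror Anthropic-style chunk content (a list of dicts); it does not use the actual langchain_core objects:

```python
import json

# Assumed shape: Anthropic-style .content is a list of content blocks,
# not a plain string.
content = [{"type": "text", "text": "Hello"}]

# json.dumps always yields a string, but it is a JSON serialization of the
# blocks, not the raw token text:
print(json.dumps(content))  # → [{"type": "text", "text": "Hello"}]

# Extracting only the text blocks (roughly what chunk.message.text would
# represent) drops reasoning/tool-call blocks but gives clean token text:
text = "".join(
    block["text"]
    for block in content
    if isinstance(block, dict) and block.get("type") == "text"
)
print(text)  # → Hello
```

So `json.dumps` keeps all content but changes its meaning (callers receive JSON, not tokens), while the text-only approach is lossy but matches the documented `str` contract.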
This requires very careful design review. It's possible that the type is wrong and the callback handler actually needs to support all content, not just text. We need to check how things are wired.
@eyurtsev right now it only supports text, do you mean we need to update
Yeah, I think @eyurtsev's point was that given how often this is used it's likely that coercing to string will break things, and that the original type is actually not accurate. But we'd need to scope the change. Closing for now to continue discussion on the original issue.
The on_llm_new_token function should only accept values of type string. However, the .content attribute can sometimes be a list, so it's unclear why it wasn't being cast to a string. This PR addresses that by adding the necessary type casting.

Fixes: on_llm_new_token gets list of dicts instead of string for Anthropic models #30703
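A minimal sketch of the type mismatch the linked issue describes. The handler below is hypothetical, but its signature mirrors the documented `str` annotation on `on_llm_new_token`; downstream handlers written against that contract break when a list of dicts arrives:

```python
class LoggingHandler:
    """Hypothetical callback handler assuming the documented str contract."""

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        # Handlers routinely assume str operations are safe on the token.
        print(token.strip())


handler = LoggingHandler()
handler.on_llm_new_token("Hello")  # works as documented

# With Anthropic-style list content, the same call raises AttributeError,
# because list has no .strip — the annotation did not match reality:
try:
    handler.on_llm_new_token([{"type": "text", "text": "Hello"}])
except AttributeError as err:
    print(type(err).__name__)
```

This is why the PR coerces `.content` to a string before invoking the callback, and why the review discussion centers on which coercion (`json.dumps` vs. text-only) is the right contract.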