@llenodo llenodo commented Oct 25, 2025

What this does

Adds support for custom schema names and changes the default schema name to use the RubyLLM::Schema class name, if one is provided.

Screencast overview: https://www.loom.com/share/6cea87f03be94444bef28309a0ace3b0
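As a rough sketch of what this enables, here is the shape of the OpenAI `response_format` payload once a schema name is supplied (field names follow OpenAI's structured-output API; `person_profile` is an illustrative name, not something from this PR):

```ruby
# Illustrative schema body.
schema = {
  type: 'object',
  properties: { name: { type: 'string' }, age: { type: 'integer' } },
  required: %w[name age]
}

# OpenAI's json_schema response_format requires a name; with this change
# it can come from the caller (or the RubyLLM::Schema class name) instead
# of always being the hardcoded default.
response_format = {
  type: 'json_schema',
  json_schema: {
    name: 'person_profile', # previously always 'response'
    schema: schema
  }
}
```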

Type of change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation
  • Performance improvement

Scope check

  • I read the Contributing Guide
  • This aligns with RubyLLM's focus on LLM communication
  • This isn't application-specific logic that belongs in user code
  • This benefits most users, not just my specific use case

Quality check

  • I ran overcommit --install and all hooks pass
  • I tested my changes thoroughly
    • For provider changes: Re-recorded VCR cassettes with bundle exec rake vcr:record[provider_name]
    • All tests pass: bundle exec rspec
  • I updated documentation if needed
  • I didn't modify auto-generated files manually (models.json, aliases.json)

API changes

  • Breaking change
  • New public methods/classes
  • Changed method signatures
  • No API changes

Related issues

Implements #475

llenodo and others added 4 commits October 23, 2025 21:20

OpenAI's response_format requires a 'name' for json_schema definitions.
This change enables custom schema names which are useful for debugging,
logging, and clarity when using multiple schemas.

Supports two formats:
1. Full: { name: 'custom_name', schema: { type: 'object', ... } }
2. Schema only: { type: 'object', ... } - defaults to 'response'

Changes:
- Updated OpenAI provider to extract schema name from wrapper format
- Added unit tests for schema name functionality
- Maintained backward compatibility (plain schemas default to 'response')

Related to schema naming discussion for better debugging and observability.
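The two supported formats above can be sketched as a small helper (the method name is hypothetical and only mirrors the described behavior, not RubyLLM's actual internal API):

```ruby
# Hypothetical extraction logic: a { name:, schema: } wrapper supplies
# the schema name; a plain schema hash falls back to the 'response' default.
def extract_schema_name(schema)
  if schema.is_a?(Hash) && schema.key?(:name) && schema.key?(:schema)
    [schema[:name].to_s, schema[:schema]] # full wrapper format
  else
    ['response', schema]                  # schema only: default name
  end
end
```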

Previously, Chat#with_schema extracted only the [:schema] portion from
RubyLLM::Schema objects, discarding the class name. Now it preserves the
full hash, allowing schema class names to be used automatically in the OpenAI API.

Changes:
- Updated Chat#with_schema to keep full to_json_schema result
- RubyLLM::Schema classes now automatically use their class name
- Added test for RubyLLM::Schema format with class name
- Manual wrappers and plain schemas still work as before

Example:
  class PersonSchema < RubyLLM::Schema
    string :name
    integer :age
  end

  chat.with_schema(PersonSchema)
  # OpenAI API receives: name: 'PersonSchema' (instead of 'response')
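A mock illustration of the before/after behavior (`PersonLikeSchema` is a stand-in class, and the `{ name:, schema: }` shape of `to_json_schema` is assumed from the commit description):

```ruby
class PersonLikeSchema
  # Stand-in for a RubyLLM::Schema subclass; the output shape here
  # is taken from the commit description, not the gem's source.
  def self.to_json_schema
    {
      name: name, # Class#name => 'PersonLikeSchema'
      schema: { type: 'object', properties: { name: { type: 'string' } } }
    }
  end
end

# Before: only the :schema portion was kept, so the class name was lost.
old_payload = PersonLikeSchema.to_json_schema[:schema]
# After: the full hash is preserved, so the class name reaches the provider.
new_payload = PersonLikeSchema.to_json_schema
```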

Documents the new custom schema name feature in the structured output
section, explaining how to use the full format with custom names and
when they're useful (influencing model behavior and debugging).

Also adds a note in the RubyLLM::Schema section explaining that schema
classes automatically use their class name in API requests.

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>