
[COG-948] Fix ContextWindowExceededError in CodeGraph #405

Open
lxobr opened this issue Jan 6, 2025 · 0 comments
Labels: 2 points Created by Linear-GitHub Sync
Milestone: v.24

Comments
@lxobr (Collaborator)

lxobr commented Jan 6, 2025

```
instructor.exceptions.InstructorRetryException:
litellm.BadRequestError:
litellm.ContextWindowExceededError:
ContextWindowExceededError:
OpenAIException - Error code: 400
{
  'error': {
    'message': "This model's maximum context length is 128000 tokens. However, your messages resulted in 150820 tokens (150655 in the messages, 165 in the functions). Please reduce the length of the messages or functions.",
    'type': 'invalid_request_error',
    'param': 'messages',
    'code': 'context_length_exceeded'
  }
}
```
  • Implement a check on the token length of the request messages before sending them.
  • Use tiktoken to count the tokens.
  • Chunk large requests into smaller requests.
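A minimal sketch of the steps above: count tokens with tiktoken before sending, and split oversized text into chunks that fit the context window. All names (`count_tokens`, `chunk_text`, the limits) are hypothetical, not cognee's actual API; the tiktoken import is guarded with a rough character-based fallback so the sketch runs even without the package.

```python
# Pre-flight token check and chunking sketch (hypothetical helper names).
try:
    import tiktoken
    _enc = tiktoken.get_encoding("cl100k_base")

    def count_tokens(text: str) -> int:
        # Exact token count using tiktoken's cl100k_base encoding.
        return len(_enc.encode(text))
except ImportError:
    def count_tokens(text: str) -> int:
        # Crude fallback when tiktoken is unavailable: ~4 chars per token.
        return max(1, len(text) // 4)

MAX_CONTEXT_TOKENS = 128_000   # model limit from the error message
RESERVED_TOKENS = 1_000        # headroom for function schemas and the reply

def chunk_text(text: str, limit: int = MAX_CONTEXT_TOKENS - RESERVED_TOKENS):
    """Split text on paragraph boundaries into pieces under the token limit."""
    chunks, current = [], ""
    for para in text.split("\n\n"):
        candidate = f"{current}\n\n{para}" if current else para
        if count_tokens(candidate) > limit and current:
            # Flush the current chunk and start a new one with this paragraph.
            chunks.append(current)
            current = para
        else:
            current = candidate
    if current:
        chunks.append(current)
    return chunks
```

A caller would run `chunk_text` over the prompt material and issue one request per chunk instead of a single oversized request; a single paragraph that alone exceeds the limit would still need a finer split (e.g. by sentence), which this sketch omits.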

From SyncLinear.com | COG-948

@lxobr lxobr added this to the v.24 milestone Jan 6, 2025
@lxobr lxobr added the 2 points Created by Linear-GitHub Sync label Jan 6, 2025
@lxobr lxobr closed this as completed Jan 7, 2025
@alekszievr alekszievr reopened this Jan 7, 2025