Max context length exceeded in __rpc_call(method="generate_questions") #198
Unanswered
bhuvanrawal asked this question in Q&A
Replies: 1 comment 1 reply
-
This happens when your training data chunks are too large and consume the entire context window. You may need alternate strategies to break the training data into smaller chunks. If you'd prefer not to do that, another option is to use your own OpenAI API key and send requests to GPT-4-Turbo, whose 128k context window should be able to accept these larger chunks. https://vanna.ai/docs/snowflake-openai-standard-vannadb.html
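One possible chunking strategy is to split a large training document on paragraph boundaries so each piece stays under a token budget. This is a minimal sketch, not part of Vanna's API: the function name, the 14000-token budget, and the rough ~4 characters-per-token heuristic are all illustrative assumptions.

```python
def split_into_chunks(text: str, max_tokens: int = 14000) -> list[str]:
    """Split text on paragraph boundaries so each chunk stays under budget.

    Assumes a rough heuristic of ~4 characters per token; a real tokenizer
    (e.g. tiktoken) would give exact counts.
    """
    max_chars = max_tokens * 4  # rough chars-per-token estimate
    chunks: list[str] = []
    current = ""
    for paragraph in text.split("\n\n"):
        # Start a new chunk if adding this paragraph would exceed the budget.
        if current and len(current) + len(paragraph) + 2 > max_chars:
            chunks.append(current)
            current = paragraph
        else:
            current = current + "\n\n" + paragraph if current else paragraph
    if current:
        chunks.append(current)
    return chunks
```

Each resulting chunk could then be passed to the training call separately instead of the whole document at once.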
1 reply
-
{'error': {'code': -32603, 'message': 'BadRequestError(\'Error code: 400 - {\\\'error\\\': {\\\'message\\\': "This model\\\'s maximum context length is 16385 tokens. However, you requested 16583 tokens (16083 in the messages, 500 in the completion). Please reduce the length of the messages or completion.", \\\'type\\\': \\\'invalid_request_error\\\', \\\'param\\\': \\\'messages\\\', \\\'code\\\': \\\'context_length_exceeded\\\'}}\')'}, 'jsonrpc': '2.0'}
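The arithmetic in the error above is straightforward: 16083 message tokens plus the 500-token completion reservation exceeds the model's 16385-token window by 198 tokens. A minimal sketch of a pre-flight check, with constants taken from the error message (the function name is illustrative, not a real API):

```python
MODEL_CONTEXT = 16385      # model's maximum context, per the error message
COMPLETION_RESERVE = 500   # tokens reserved for the completion

def fits_in_context(message_tokens: int,
                    context: int = MODEL_CONTEXT,
                    reserve: int = COMPLETION_RESERVE) -> bool:
    """Return True if the messages plus the reserved completion fit the window."""
    return message_tokens + reserve <= context

# The failing request: 16083 + 500 = 16583 > 16385, so it is rejected.
```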