Python: Function Calling Error when Using APIM Managed Azure OpenAI #10182
Comments
A bad request means that something in the request body is malformed or not accepted. Can you give us some more details about the request being made to the model?
I am able to reproduce the issue with a very simple function call script below.
Let me know if there is anything else I need to provide.
Hi @awonglk, thanks for the added context. Can you please make sure you're following the correct flow when using the APIM auth:
The only thing I'm not following from that link is that I'm not using "OnBehalfOfCredential". Is using OnBehalfOfCredential auth a must?
Alright, I understand. Apologies for leading you down the wrong path. I'll need to figure out how to replicate your issue. For further context, have you followed this step? Pass the API key in API requests - set-backend-service policy.
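For reference, an APIM inbound policy for that step typically forwards calls to the Azure OpenAI backend and injects the real API key, so callers only need the APIM subscription key. This is a sketch only; the backend URL and the `azure-openai-key` named value are illustrative placeholders, not values from this thread:

```xml
<inbound>
    <base />
    <!-- Route the request to the Azure OpenAI resource (illustrative URL) -->
    <set-backend-service base-url="https://my-openai-resource.openai.azure.com/openai" />
    <!-- Inject the Azure OpenAI key from an APIM named value -->
    <set-header name="api-key" exists-action="override">
        <value>{{azure-openai-key}}</value>
    </set-header>
</inbound>
```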
@awonglk, based on a previous comment where a user was experiencing a function-calling error using APIM in SK .NET: could it be that the model you're calling doesn't support function calling?
Or this option as well: #8340 (comment)

```python
AzureChatCompletion(
    api_key="<value of Ocp-Apim-SubscriptionKey>",
    endpoint=AZURE_OPENAI_ENDPOINT,  # your base endpoint, which doesn't include `deployments/gpt-4o/chat/completions?api-version=2024-02-15-preview`
    deployment_name=AZURE_OPENAI_CHAT_DEPLOYMENT_NAME,  # e.g. gpt-4o
    api_version="2024-02-15-preview",
    default_headers={"Ocp-Apim-SubscriptionKey": "<value>"},
)
```
For your three comments above:
Which Azure OpenAI API version are you using? I know APIM is part of this, but the model is not failing with a 401/403, so it doesn't look like an auth issue.
In the "it works" case, are you including tools in your request? Or are you removing tools?
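To make the "tools in your request" question concrete: once function calling is enabled, the chat-completions request body gains a `tools` array, which is exactly the kind of field a gateway policy that only understands plain 'Chat Completion', 'Completion', or 'Embeddings' bodies may fail to parse. A minimal sketch of such a payload (field values here are illustrative, not taken from the thread):

```python
import json

# Chat-completions request body as sent when function calling is enabled.
# The "tools" array is the extra structure the APIM token-estimation
# step has to be able to parse.
body = {
    "messages": [{"role": "user", "content": "What is the time?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "time-time",  # illustrative plugin-function name
                "description": "Get the current time.",
                "parameters": {"type": "object", "properties": {}},
            },
        }
    ],
    "tool_choice": "auto",
}

payload = json.dumps(body)
print("tools" in json.loads(payload))  # True: the field survives serialization
```

Comparing this body with the "it works" (no tools) body is a quick way to confirm whether the gateway, rather than the model, is rejecting the request.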
Discussed in #10176
Originally posted by awonglk January 14, 2025
I've followed the article referenced in the following issue to try to get my Semantic Kernel app working with APIM-managed Azure OpenAI:
#7143
If there are no function calls involved, the responses back from the LLM seem to work as normal. But as soon as I ask a question that involves a plugin (even the core time_plugin() as an example), this is what I get when asking a simple question like "What is the time?":
```
<class 'semantic_kernel.connectors.ai.open_ai.services.azure_chat_completion.AzureChatCompletion'> service failed to complete the prompt
BadRequestError: Error code: 400 - {'statusCode': 400, 'message': "Unable to parse and estimate tokens from incoming request. Please ensure incoming request does not contain any images and is of one of the following types: 'Chat Completion', 'Completion', 'Embeddings' and works with current prompt estimation mode of 'Auto'."}
```
Is there anything obvious that I may have missed?
Using semantic-kernel 1.16.0