Connecting AzureOpenAI via LiteLLM - returning Authentication Error 401
I am using the DSPy framework (v2.6.4), which uses liteLLM (v1.63.7) to connect to LLMs.
While connecting to Azure OpenAI via liteLLM using the method below (Azure AD token refresh with DefaultAzureCredential):
from litellm import completion
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
token_provider = get_bearer_token_provider(DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default")
response = completion(
    model="azure/<your deployment name>",  # azure/<deployment name>
    api_base="<api-url>",                  # azure api base
    api_version="<api-version>",           # azure api version
    azure_ad_token_provider=token_provider,
    messages=[{"role": "user", "content": "good morning"}],
)
I am getting the below error:
litellm\litellm_core_utils\exception_mapping_utils.py", line 2001, in exception_type
raise AuthenticationError(
litellm.exceptions.AuthenticationError: litellm.AuthenticationError: AzureException AuthenticationError - Access denied due to invalid subscription key or wrong API endpoint. Make sure to provide a valid key for an active subscription and use a correct regional API endpoint for your resource.
Connecting without liteLLM (via openai.AzureOpenAI) works fine, but when the same credentials are used via liteLLM, I get an authentication error.
Code that works
import openai
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
token_provider = get_bearer_token_provider(DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default")
openai_client = openai.AzureOpenAI(
    api_version="<---version--->",
    azure_endpoint="<---endpoint--->",
    azure_deployment="<---deployment_name--->",
    azure_ad_token_provider=token_provider
)

def interact_with_model():
    try:
        response = openai_client.chat.completions.create(
            model="gpt-4o",
            messages=[
                {"role": "system", "content": "You are a helpful assistant that helps me with my math homework!"},
                {"role": "user", "content": "Hello! Could you solve 20 x 5?"}
            ],
            max_tokens=100
        )
        print(response)
        return response.choices[0].message.content
    except Exception as e:
        return f"Error: {e}"

if __name__ == "__main__":
    response = interact_with_model()
    print(f"Response from the model: {response}")
Anyone faced similar issues? Am I missing something here?
asked Mar 27 at 8:08 by 4run4, edited Mar 28 at 5:34 · 1 Answer
litellm.AuthenticationError: AzureException AuthenticationError - Access denied due to invalid subscription key or wrong API endpoint. Make sure to provide a valid key for an active subscription and use a correct regional API endpoint for your resource.
The above error usually means a wrong access token was passed, or the token was passed to the endpoint incorrectly.
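As a quick sanity check before changing any code, you can decode the token payload locally (a diagnostic sketch, not part of the fix; it assumes the credential returns a standard JWT) and confirm its aud claim is https://cognitiveservices.azure.com, the audience Azure OpenAI expects:

```python
import base64
import json

def jwt_audience(token: str) -> str:
    """Decode a JWT payload WITHOUT signature verification and return its 'aud' claim."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped base64 padding
    claims = json.loads(base64.urlsafe_b64decode(payload_b64))
    return claims.get("aud", "")

# Usage with a real credential would be:
#   token = DefaultAzureCredential().get_token("https://cognitiveservices.azure.com/.default").token
#   print(jwt_audience(token))   # expect: https://cognitiveservices.azure.com
```

If the audience is wrong, the problem is the scope string passed to get_token, not liteLLM.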
You can use the code below; it fetches the token explicitly and passes it to liteLLM in Python.
Code:
from litellm import completion
from azure.identity import DefaultAzureCredential
import json
# Get Azure AD Token
credential = DefaultAzureCredential()
token = credential.get_token("https://cognitiveservices.azure.com/.default").token
# Call liteLLM with the token
response = completion(
    model="azure/<deployment name>",
    api_base="<Resource endpoint>",
    api_version="2023-05-15",
    azure_ad_token=token,  # Use azure_ad_token, not azure_ad_token_provider
    messages=[{"role": "user", "content": "good morning"}],
)
print(json.dumps(response.model_dump(), indent=4))
Output:
{
    "id": "cxxxcmpl-xxxxx",
    "created": 1743074722,
    "model": "xxxxxx",
    "object": "chat.completion",
    "system_fingerprint": "xxx",
    "choices": [
        {
            "finish_reason": "stop",
            "index": 0,
            "message": {
                "content": "Good morning! How can I assist you today?",
                "role": "assistant",
                "tool_calls": null,
                "function_call": null
            }
        }
    ],
    "usage": {
        "completion_tokens": 11,
        "prompt_tokens": 9,
        "total_tokens": 20,
        "completion_tokens_details": {
            "accepted_prediction_tokens": 0,
            "audio_tokens": 0,
            "reasoning_tokens": 0,
            "rejected_prediction_tokens": 0
        },
        "prompt_tokens_details": {
            "audio_tokens": 0,
            "cached_tokens": 0
        }
    },
    "service_tier": null
}
Reference: Azure OpenAI | liteLLM
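One caveat with the static-token approach: Azure AD access tokens expire (typically after about an hour), so a long-running process should refresh the token before each call. A minimal sketch of a caching fetcher (CachedTokenFetcher is a hypothetical helper written here for illustration, not a liteLLM or azure-identity API):

```python
import time

class CachedTokenFetcher:
    """Caches an Azure AD access token and re-fetches it shortly before expiry."""

    def __init__(self, credential,
                 scope="https://cognitiveservices.azure.com/.default",
                 refresh_margin_s=300):
        self._credential = credential    # e.g. DefaultAzureCredential()
        self._scope = scope
        self._margin = refresh_margin_s  # refresh this many seconds before expiry
        self._cached = None              # AccessToken(token, expires_on) or None

    def token(self) -> str:
        # Re-fetch when there is no cached token or it is close to expiring.
        if self._cached is None or self._cached.expires_on - time.time() < self._margin:
            self._cached = self._credential.get_token(self._scope)
        return self._cached.token
```

You would then pass azure_ad_token=fetcher.token() on each completion call, so every request carries a token that is still valid.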
Comment from Venkatesan (Mar 27 at 9:29):
credential = DefaultAzureCredential()
token = credential.get_token("https://cognitiveservices.azure.com/.default").token