
[Bug]: DefaultAzureCredential not working #7665

Open
WGlenn43 opened this issue Jan 10, 2025 · 8 comments · Fixed by #7670
Labels: bug (Something isn't working), mlops user request

Comments

@WGlenn43

What happened?

I tried to follow this guide. My exact setup for the YAML is:

model_list:
  - model_name: gpt-4o
    litellm_params:
      model: azure/gpt-4o
      api_base: https://[my_url].openai.azure.com/

litellm_settings:
  enable_azure_ad_token_refresh: true

Relevant log output

$ > litellm --config C:\litellm_config.yaml
INFO:     Started server process [19156]
INFO:     Waiting for application startup.

#------------------------------------------------------------#
#                                                            #
#               'A feature I really want is...'               #
#        https://github.com/BerriAI/litellm/issues/new        #
#                                                            #
#------------------------------------------------------------#

 Thank you for using LiteLLM! - Krrish & Ishaan



Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new


LiteLLM: Proxy initialized with Config, Set models:
    gpt-4o
ERROR:    Traceback (most recent call last):
  File "C:\home\starlette\routing.py", line 732, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "C:\home\starlette\routing.py", line 608, in __aenter__
    await self._router.startup()
  File "C:\home\starlette\routing.py", line 709, in startup
    await handler()
  File "C:\home\litellm\proxy\proxy_server.py", line 3128, in startup_event
    await initialize(**worker_config)
  File "C:\home\litellm\proxy\proxy_server.py", line 2564, in initialize
    ) = await proxy_config.load_config(router=llm_router, config_file_path=config)
  File "C:\home\litellm\proxy\proxy_server.py", line 2029, in load_config
    router = litellm.Router(
  File "C:\home\litellm\router.py", line 389, in __init__
    self.set_model_list(model_list)
  File "C:\home\litellm\router.py", line 3968, in set_model_list
    self._create_deployment(
  File "C:\home\litellm\router.py", line 3887, in _create_deployment
    deployment = self._add_deployment(deployment=deployment)
  File "C:\home\litellm\router.py", line 4049, in _add_deployment
    InitalizeOpenAISDKClient.set_client(
  File "C:\home\litellm\router_utils\client_initalization_utils.py", line 373, in set_client
    _client = openai.AsyncAzureOpenAI(  # type: ignore
  File "C:\home\openai\lib\azure.py", line 409, in __init__
    raise OpenAIError(
openai.OpenAIError: Missing credentials. Please pass one of `api_key`, `azure_ad_token`, `azure_ad_token_provider`, or the `AZURE_OPENAI_API_KEY` or `AZURE_OPENAI_AD_TOKEN` environment variables.

Are you a ML Ops Team?

Yes

What LiteLLM version are you on ?

v1.53.9

Twitter / LinkedIn details

No response

@WGlenn43 WGlenn43 added the bug Something isn't working label Jan 10, 2025
@WGlenn43 WGlenn43 changed the title [Bug]: [Bug]: DefaultAzureCredential not working Jan 10, 2025
@krrishdholakia (Contributor)

Able to repro

@krrishdholakia krrishdholakia self-assigned this Jan 10, 2025
@krrishdholakia (Contributor)

Ah, the docs are missing details.

You need these keys

AZURE_CLIENT_ID=""
AZURE_CLIENT_SECRET=""
AZURE_TENANT_ID=""

in your environment
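For context, these are the variable names that azure-identity's `EnvironmentCredential` (the first link in the `DefaultAzureCredential` chain) reads for a service-principal login. A minimal sketch of a startup check that would surface the missing values, with a hypothetical helper name (this is not LiteLLM code):

```python
import os

# The three variables azure-identity's EnvironmentCredential reads for a
# service-principal login (documented azure-identity variable names).
REQUIRED = ("AZURE_CLIENT_ID", "AZURE_CLIENT_SECRET", "AZURE_TENANT_ID")

def missing_azure_env_vars(environ=os.environ):
    """Return the names of required Azure AD variables that are unset or blank."""
    return [name for name in REQUIRED if not environ.get(name)]

# A blank string counts as missing, so exporting the variables as "" would
# still fail in the same way.
print(missing_azure_env_vars({"AZURE_CLIENT_ID": "", "AZURE_TENANT_ID": "tenant"}))
```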

@krrishdholakia (Contributor)

Adding those env vars solves the issue.

[Screenshot: 2025-01-10 at 8:28 AM]

@WGlenn43 (Author) commented Jan 10, 2025

> Ah, the docs are missing details.
>
> You need these keys
>
> AZURE_CLIENT_ID=""
> AZURE_CLIENT_SECRET=""
> AZURE_TENANT_ID=""
>
> in your environment

This still does not seem to fix the issue. I get the exact same log output after setting these environment variables:

$ > $env:AZURE_CLIENT_ID = ""
$ > $env:AZURE_CLIENT_SECRET = ""
$ > $env:AZURE_TENANT_ID = ""

$ > litellm --config C:\litellm_config.yaml
INFO:     Started server process [10632]
INFO:     Waiting for application startup.

#------------------------------------------------------------#
#                                                            #
#              'I don't like how this works...'               #
#        https://github.com/BerriAI/litellm/issues/new        #
#                                                            #
#------------------------------------------------------------#

 Thank you for using LiteLLM! - Krrish & Ishaan



Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new


LiteLLM: Proxy initialized with Config, Set models:
    gpt-4o
ERROR:    Traceback (most recent call last):
  File "C:\home\starlette\routing.py", line 732, in lifespan
    async with self.lifespan_context(app) as maybe_state:
  File "C:\home\starlette\routing.py", line 608, in __aenter__
    await self._router.startup()
  File "C:\home\starlette\routing.py", line 709, in startup
    await handler()
  File "C:\home\litellm\proxy\proxy_server.py", line 3128, in startup_event
    await initialize(**worker_config)
  File "C:\home\litellm\proxy\proxy_server.py", line 2564, in initialize
    ) = await proxy_config.load_config(router=llm_router, config_file_path=config)
  File "C:\home\litellm\proxy\proxy_server.py", line 2029, in load_config
    router = litellm.Router(
  File "C:\home\litellm\router.py", line 389, in __init__
    self.set_model_list(model_list)
  File "C:\home\litellm\router.py", line 3968, in set_model_list
    self._create_deployment(
  File "C:\home\litellm\router.py", line 3887, in _create_deployment
    deployment = self._add_deployment(deployment=deployment)
  File "C:\home\litellm\router.py", line 4049, in _add_deployment
    InitalizeOpenAISDKClient.set_client(
  File "C:\home\litellm\router_utils\client_initalization_utils.py", line 373, in set_client
    _client = openai.AsyncAzureOpenAI(  # type: ignore
  File "C:\home\openai\lib\azure.py", line 409, in __init__
    raise OpenAIError(
openai.OpenAIError: Missing credentials. Please pass one of `api_key`, `azure_ad_token`, `azure_ad_token_provider`, or the `AZURE_OPENAI_API_KEY` or `AZURE_OPENAI_AD_TOKEN` environment variables.

ERROR:    Application startup failed. Exiting.

rajatvig pushed a commit to rajatvig/litellm that referenced this issue Jan 16, 2025
* test(test_get_model_info.py): add unit test confirming router deployment updates global 'get_model_info'

* fix(get_supported_openai_params.py): fix custom llm provider 'get_supported_openai_params'

Fixes BerriAI#7668

* docs(azure.md): clarify how azure ad token refresh on proxy works

Closes BerriAI#7665
@WGlenn43 (Author) commented Jan 16, 2025

@krrishdholakia I'm still hitting this issue. I've tried setting all 3 items you suggested to a blank string. I'm running on Windows Powershell. I'm happy to get on a call to debug if it helps
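One common pitfall worth ruling out here: the variables must be set in the same shell session (or machine environment) that launches the proxy, so the child process actually inherits them. A POSIX-shell illustration of that inheritance, with a placeholder value (the PowerShell equivalent, `$env:AZURE_CLIENT_ID = "..."`, likewise applies to the current session only):

```shell
# Export in the launching shell, then verify the child process sees it.
export AZURE_CLIENT_ID="example-client-id"
python3 -c 'import os; print(os.environ["AZURE_CLIENT_ID"])'
```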


@SeaDude commented Jan 16, 2025

Hm... this still doesn't create a passwordless experience. We exclusively use the `DefaultAzureCredential()` method from the azure-identity library. When users create Azure OpenAI clients with this method, RBAC roles govern access.

We don't want to create, communicate, or manage either AOAI API keys or Service Principal client secrets. And really, there is no need if `DefaultAzureCredential()` can be used.

This is a huge deal for enterprise customers. Let's get aider in the hands of more enterprise GenAI devs.

@krrishdholakia (Contributor)

> When users create Azure OpenAI clients with this method

Can you point me to how the Azure OpenAI client handles this, @SeaDude?
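For reference, the pattern the azure-identity docs describe is `get_bearer_token_provider(DefaultAzureCredential(), scope)`, whose result is passed to the OpenAI SDK as `azure_ad_token_provider`. A self-contained sketch of that pattern follows, with a `FakeCredential` standing in for `DefaultAzureCredential` (the real class lives in `azure.identity`); the provider is just a zero-argument callable returning a current token, which the SDK invokes before requests:

```python
import time
from dataclasses import dataclass

@dataclass
class AccessToken:
    token: str
    expires_on: float

class FakeCredential:
    """Stand-in for azure.identity.DefaultAzureCredential (illustration only)."""
    def __init__(self):
        self.calls = 0
    def get_token(self, scope):
        self.calls += 1
        return AccessToken(f"token-{self.calls}", time.time() + 3600)

def get_bearer_token_provider(credential, scope):
    """Return a zero-arg callable yielding a bearer token, cached until near expiry."""
    cached = {}
    def provider():
        tok = cached.get("tok")
        if tok is None or tok.expires_on - time.time() < 300:  # refresh 5 min early
            cached["tok"] = tok = credential.get_token(scope)
        return tok.token
    return provider

cred = FakeCredential()
provider = get_bearer_token_provider(cred, "https://cognitiveservices.azure.com/.default")
print(provider(), provider())  # second call hits the cache
```

With the real library, the equivalent wiring is `AzureOpenAI(azure_ad_token_provider=provider, ...)`, so no API key or client secret ever needs to be configured.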
