Getting error while using pr-agent for Azure DevOps (ADO) pull requests #7644

Open
chandrakanth7 opened this issue Jan 9, 2025 · 1 comment

2025-01-08 23:30:19.165 | INFO | pr_agent.git_providers.azuredevops_provider:get_diff_files:373 - Invalid files: []
2025-01-08 23:30:19.657 | INFO | pr_agent.tools.pr_reviewer:run:111 - Reviewing PR: https://dev.azure.com/SalientMinds/xnode/_git/python.storage.api/pullrequest/13791 ...
2025-01-08 23:30:19.829 | INFO | pr_agent.algo.pr_processing:get_pr_diff:63 - PR main language: Other
2025-01-08 23:30:19.839 | INFO | pr_agent.algo.pr_processing:get_pr_diff:74 - Tokens: 1302, total tokens under limit: 32000, returning full diff.

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

2025-01-08 23:30:19.894 | WARNING | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference:

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

2025-01-08 23:30:19.911 | WARNING | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference:

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

2025-01-08 23:30:19.923 | WARNING | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference:

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

2025-01-08 23:30:19.935 | WARNING | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference:

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

2025-01-08 23:30:19.950 | WARNING | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference:
2025-01-08 23:30:19.957 | WARNING | pr_agent.algo.pr_processing:retry_with_fallback_models:331 - Failed to generate prediction with gpt-4-turbo-2024-04-09: Traceback (most recent call last):
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\main.py", line 401, in acompletion
response = await init_response
^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\llms\openai.py", line 1124, in acompletion
raise e
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\llms\openai.py", line 1055, in acompletion
openai_aclient = self._get_openai_client(
^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\llms\openai.py", line 734, in _get_openai_client
_new_client: Union[OpenAI, AsyncOpenAI] = AsyncOpenAI(
^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\openai\_client.py", line 337, in __init__
super().__init__(
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\openai\_base_client.py", line 1419, in __init__
self._client = http_client or AsyncHttpxClientWrapper(
^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\openai\_base_client.py", line 1316, in __init__
super().__init__(**kwargs)
TypeError: AsyncClient.__init__() got an unexpected keyword argument 'proxies'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\tenacity\_asyncio.py", line 50, in __call__
result = await fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\pr_agent\algo\ai_handlers\litellm_ai_handler.py", line 216, in chat_completion
response = await acompletion(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\utils.py", line 1579, in wrapper_async
raise e
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\utils.py", line 1399, in wrapper_async
result = await original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\main.py", line 424, in acompletion
raise exception_type(
^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\utils.py", line 8305, in exception_type
raise e # it's already mapped
^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\utils.py", line 6723, in exception_type
raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: APIConnectionError: OpenAIException - AsyncClient.__init__() got an unexpected keyword argument 'proxies'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\pr_agent\algo\pr_processing.py", line 329, in retry_with_fallback_models
return await f(model)
^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\pr_agent\tools\pr_reviewer.py", line 163, in _prepare_prediction
self.prediction = await self._get_prediction(model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\pr_agent\tools\pr_reviewer.py", line 185, in _get_prediction
response, finish_reason = await self.ai_handler.chat_completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\tenacity\_asyncio.py", line 88, in async_wrapped
return await fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\tenacity\_asyncio.py", line 47, in __call__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\tenacity\__init__.py", line 326, in iter
raise retry_exc from fut.exception()
tenacity.RetryError: RetryError[<Future at 0x217146c1ad0 state=finished raised APIConnectionError>]

2025-01-08 23:30:20.051 | INFO | pr_agent.algo.pr_processing:get_pr_diff:63 - PR main language: Other
2025-01-08 23:30:20.059 | INFO | pr_agent.algo.pr_processing:get_pr_diff:74 - Tokens: 1302, total tokens under limit: 32000, returning full diff.
Task exception was never retrieved
future: <Task finished name='Task-9' coro=<AsyncClient.aclose() done, defined at c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\httpx\_client.py:1978> exception=AttributeError("'AsyncHttpxClientWrapper' object has no attribute '_state'")>
Traceback (most recent call last):
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\httpx\_client.py", line 1982, in aclose
if self._state != ClientState.CLOSED:
^^^^^^^^^^^
AttributeError: 'AsyncHttpxClientWrapper' object has no attribute '_state'
Task exception was never retrieved
future: <Task finished name='Task-10' coro=<AsyncClient.aclose() done, defined at c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\httpx\_client.py:1978> exception=AttributeError("'AsyncHttpxClientWrapper' object has no attribute '_state'")>
Traceback (most recent call last):
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\httpx\_client.py", line 1982, in aclose
if self._state != ClientState.CLOSED:
^^^^^^^^^^^
AttributeError: 'AsyncHttpxClientWrapper' object has no attribute '_state'
Task exception was never retrieved
future: <Task finished name='Task-11' coro=<AsyncClient.aclose() done, defined at c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\httpx\_client.py:1978> exception=AttributeError("'AsyncHttpxClientWrapper' object has no attribute '_state'")>
Traceback (most recent call last):
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\httpx\_client.py", line 1982, in aclose
if self._state != ClientState.CLOSED:
^^^^^^^^^^^
AttributeError: 'AsyncHttpxClientWrapper' object has no attribute '_state'
Task exception was never retrieved
future: <Task finished name='Task-12' coro=<AsyncClient.aclose() done, defined at c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\httpx\_client.py:1978> exception=AttributeError("'AsyncHttpxClientWrapper' object has no attribute '_state'")>
Traceback (most recent call last):
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\httpx\_client.py", line 1982, in aclose
if self._state != ClientState.CLOSED:
^^^^^^^^^^^
AttributeError: 'AsyncHttpxClientWrapper' object has no attribute '_state'
Task exception was never retrieved
future: <Task finished name='Task-13' coro=<AsyncClient.aclose() done, defined at c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\httpx\_client.py:1978> exception=AttributeError("'AsyncHttpxClientWrapper' object has no attribute '_state'")>
Traceback (most recent call last):
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\httpx\_client.py", line 1982, in aclose
if self._state != ClientState.CLOSED:
^^^^^^^^^^^
AttributeError: 'AsyncHttpxClientWrapper' object has no attribute '_state'

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

2025-01-08 23:30:20.101 | WARNING | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference:

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

2025-01-08 23:30:20.115 | WARNING | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference:

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

2025-01-08 23:30:20.130 | WARNING | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference:

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

2025-01-08 23:30:20.142 | WARNING | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference:

Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True'.

2025-01-08 23:30:20.156 | WARNING | pr_agent.algo.ai_handlers.litellm_ai_handler:chat_completion:218 - Error during OpenAI inference:
2025-01-08 23:30:20.163 | WARNING | pr_agent.algo.pr_processing:retry_with_fallback_models:331 - Failed to generate prediction with gpt-4o-2024-05-13: Traceback (most recent call last):
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\main.py", line 401, in acompletion
response = await init_response
^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\llms\openai.py", line 1124, in acompletion
raise e
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\llms\openai.py", line 1055, in acompletion
openai_aclient = self._get_openai_client(
^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\llms\openai.py", line 734, in _get_openai_client
_new_client: Union[OpenAI, AsyncOpenAI] = AsyncOpenAI(
^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\openai\_client.py", line 337, in __init__
super().__init__(
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\openai\_base_client.py", line 1419, in __init__
self._client = http_client or AsyncHttpxClientWrapper(
^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\openai\_base_client.py", line 1316, in __init__
super().__init__(**kwargs)
TypeError: AsyncClient.__init__() got an unexpected keyword argument 'proxies'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\tenacity\_asyncio.py", line 50, in __call__
result = await fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\pr_agent\algo\ai_handlers\litellm_ai_handler.py", line 216, in chat_completion
response = await acompletion(**kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\utils.py", line 1579, in wrapper_async
raise e
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\utils.py", line 1399, in wrapper_async
result = await original_function(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\main.py", line 424, in acompletion
raise exception_type(
^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\utils.py", line 8305, in exception_type
raise e # it's already mapped
^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\litellm\utils.py", line 6723, in exception_type
raise APIConnectionError(
litellm.exceptions.APIConnectionError: litellm.APIConnectionError: APIConnectionError: OpenAIException - AsyncClient.__init__() got an unexpected keyword argument 'proxies'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\pr_agent\algo\pr_processing.py", line 329, in retry_with_fallback_models
return await f(model)
^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\pr_agent\tools\pr_reviewer.py", line 163, in _prepare_prediction
self.prediction = await self._get_prediction(model)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\pr_agent\tools\pr_reviewer.py", line 185, in _get_prediction
response, finish_reason = await self.ai_handler.chat_completion(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\tenacity\_asyncio.py", line 88, in async_wrapped
return await fn(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\tenacity\_asyncio.py", line 47, in __call__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\chandrakanthreddy.ch\Desktop\code\python.code.api\.venv\Lib\site-packages\tenacity\__init__.py", line 326, in iter
raise retry_exc from fut.exception()
tenacity.RetryError: RetryError[<Future at 0x2171446db90 state=finished raised APIConnectionError>]

2025-01-08 23:30:20.172 | ERROR | pr_agent.tools.pr_reviewer:run:152 - Failed to review PR: RetryError[<Future at 0x2171446db90 state=finished raised APIConnectionError>]
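
For anyone triaging this: the root `TypeError: AsyncClient.__init__() got an unexpected keyword argument 'proxies'` looks like the known incompatibility between httpx 0.28 (which removed the long-deprecated `proxies` keyword from `httpx.AsyncClient`) and older `openai`/`litellm` releases that still pass it when constructing their internal client wrapper. A minimal sketch of that failure mode, using a hypothetical stand-in class rather than httpx itself:

```python
# Hypothetical stand-in for httpx.AsyncClient after the `proxies` keyword
# was removed: any caller still passing proxies=... now raises TypeError,
# matching the traceback above.
class AsyncClient:
    def __init__(self, *, timeout=None, proxy=None):  # no `proxies` parameter
        self.timeout = timeout
        self.proxy = proxy


def build_client():
    # Mimics the failing call chain: an older client library forwarding
    # a `proxies` mapping into the constructor.
    return AsyncClient(proxies={"http://": "http://proxy:8080"})


try:
    build_client()
except TypeError as exc:
    print(exc)  # unexpected keyword argument 'proxies'
```

If that is the cause here, the usual workarounds have been pinning `httpx<0.28` in the environment, or upgrading `openai`/`litellm` to versions that no longer pass `proxies` — treat both as assumptions to verify against the installed package versions.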


jkkjjj commented Jan 10, 2025

I've encountered the same issue as well.
