
Addresses Issue #245 for better logging #332

Open

patilvishal0597 wants to merge 1 commit into main
Conversation

patilvishal0597

Added `logger_util` to enable package- and class-wide logging in `langchain-aws`, and added logging for `invoke` and `ainvoke`.

Added a logger utility for configuring the logger. Tested logging of `invoke`, `ainvoke`, and `converse` requests/responses.

Behaviour (a sketch of how these flags could be wired up follows this list):

LANGCHAIN_AWS_DEBUG = LANGCHAIN_AWS_DEBUG_ROOT = True --> Logs debug messages across boto3 and the application

LANGCHAIN_AWS_DEBUG = True; LANGCHAIN_AWS_DEBUG_ROOT = False --> Logs only application debug messages

LANGCHAIN_AWS_DEBUG = False; LANGCHAIN_AWS_DEBUG_ROOT = True --> Logs only application info messages

LANGCHAIN_AWS_DEBUG = LANGCHAIN_AWS_DEBUG_ROOT = False --> Logs only application info messages
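
For illustration only, here is a minimal sketch of a `logger_util`-style helper that maps these two environment variables onto Python's standard `logging` configuration. The helper name `get_logger`, the flag parsing, and the format string are assumptions chosen to mirror the output shown below, not the PR's actual implementation:

```python
import logging
import os

# Format guessed from the sample output below:
# "2025-01-15 14:30:31 DEBUG | [connectionpool.py:1051] | urllib3.connectionpool - ..."
_LOG_FORMAT = "%(asctime)s %(levelname)s | [%(filename)s:%(lineno)d] | %(name)s - %(message)s"


def _env_flag(name: str) -> bool:
    # Treat "true"/"1" (case-insensitive) as enabled, anything else as disabled.
    return os.getenv(name, "").strip().lower() in ("true", "1")


def get_logger(module_name: str) -> logging.Logger:
    debug = _env_flag("LANGCHAIN_AWS_DEBUG")
    debug_root = _env_flag("LANGCHAIN_AWS_DEBUG_ROOT")

    # The root logger drops to DEBUG only when both flags are set, which is
    # what pulls in the boto3/botocore/urllib3 debug output.
    root_level = logging.DEBUG if (debug and debug_root) else logging.INFO
    logging.basicConfig(level=root_level, format=_LOG_FORMAT, datefmt="%Y-%m-%d %H:%M:%S")

    # The application logger is DEBUG whenever LANGCHAIN_AWS_DEBUG is set,
    # and INFO otherwise.
    logger = logging.getLogger(module_name)
    logger.setLevel(logging.DEBUG if debug else logging.INFO)
    return logger
```

The detail that makes the table above work is that application DEBUG records still reach the root handler even when the root logger stays at INFO, because records propagated from a child logger are filtered by the handler's level, not the root logger's level.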

Invoke call:

from langchain_aws import ChatBedrock

llm = ChatBedrock(
    model_id="us.anthropic.claude-3-5-sonnet-20241022-v2:0",
    region_name="us-east-1",
    model_kwargs={
        "max_tokens": 100,
        "top_p": 0.9,
        "temperature": 0.1,
    },
)

# Input to the llm
messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love going out for a walk when the weather is bright and sunny."),
]

# Invoke the llm
response = llm.invoke(messages)
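
Since the PR also adds logging for `ainvoke`, the equivalent async call (purely illustrative, reusing the `llm` and `messages` defined above) goes through the same request/response logging; the output below is from the synchronous `invoke` call:

```python
import asyncio


async def main() -> None:
    # ainvoke exercises the same log statements as invoke.
    response = await llm.ainvoke(messages)
    print(response.content)


asyncio.run(main())
```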

Logging output:

Debug logs with the LANGCHAIN_AWS_DEBUG and LANGCHAIN_AWS_DEBUG_ROOT env vars set to True:

...
...
....
2025-01-15 14:30:31 DEBUG | [connectionpool.py:1051] | urllib3.connectionpool - Starting new HTTPS connection (1): bedrock-runtime.us-east-1.amazonaws.com:443
2025-01-15 14:30:33 DEBUG | [connectionpool.py:546] | urllib3.connectionpool - https://bedrock-runtime.us-east-1.amazonaws.com:443 "POST /model/us.anthropic.claude-3-5-sonnet-20241022-v2%3A0/invoke HTTP/1.1" 200 299
2025-01-15 14:30:33 DEBUG | [parsers.py:241] | botocore.parsers - Response headers: {'Date': 'Wed, 15 Jan 2025 22:30:33 GMT', 'Content-Type': 'application/json', 'Content-Length': '299', 'Connection': 'keep-alive', 'x-amzn-RequestId': 'xxxxxx', 'X-Amzn-Bedrock-Invocation-Latency': '1077', 'X-Amzn-Bedrock-Output-Token-Count': '22', 'X-Amzn-Bedrock-Input-Token-Count': '40'}
2025-01-15 14:30:33 DEBUG | [parsers.py:242] | botocore.parsers - Response body:
<botocore.response.StreamingBody object at 0x108e595d0>
2025-01-15 14:30:33 DEBUG | [hooks.py:238] | botocore.hooks - Event needs-retry.bedrock-runtime.InvokeModel: calling handler <botocore.retryhandler.RetryHandler object at 0x108e58c80>
2025-01-15 14:30:33 DEBUG | [retryhandler.py:211] | botocore.retryhandler - No retry needed.
2025-01-15 14:30:33 INFO | [bedrock.py:603] | root - The output message sent by user: content="J'aime me promener quand il fait beau et ensoleillé." additional_kwargs={'usage': {'prompt_tokens': 40, 'completion_tokens': 22, 'total_tokens': 62}, 'stop_reason': 'end_turn', 'model_id': 'us.anthropic.claude-3-5-sonnet-20241022-v2:0'} response_metadata={} usage_metadata={'input_tokens': 40, 'output_tokens': 22, 'total_tokens': 62}
J'aime me promener quand il fait beau et ensoleillé.

Logging output when the LANGCHAIN_AWS_DEBUG and LANGCHAIN_AWS_DEBUG_ROOT flags are False and the logger is initialized with a module_name:

2025-01-15 14:39:40 INFO | langchain_aws.chat_models.bedrock - The input message sent by user: [SystemMessage(content='You are a helpful assistant that translates English to French. Translate the user sentence.', additional_kwargs={}, response_metadata={}), HumanMessage(content='I love going out for a walk when the weather is bright and sunny.', additional_kwargs={}, response_metadata={})]
2025-01-15 14:39:40 ERROR | langchain_aws.chat_models.bedrock - Testing error log
2025-01-15 14:39:41 INFO | langchain_aws.chat_models.bedrock - The output message sent by user: content="J'aime me promener quand il fait beau et ensoleillé." additional_kwargs={'usage': {'prompt_tokens': 40, 'completion_tokens': 22, 'total_tokens': 62}, 'stop_reason': 'end_turn', 'model_id': 'us.anthropic.claude-3-5-sonnet-20241022-v2:0'} response_metadata={} usage_metadata={'input_tokens': 40, 'output_tokens': 22, 'total_tokens': 62}
J'aime me promener quand il fait beau et ensoleillé.
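
For reference, the `langchain_aws.chat_models.bedrock` prefix in the output above is just the logger's name, which is what a module-level logger created from `__name__` produces. A standalone illustration (not the PR's code; the format string is inferred from the output above):

```python
import logging

logging.basicConfig(
    level=logging.INFO,
    format="%(asctime)s %(levelname)s | %(name)s - %(message)s",
    datefmt="%Y-%m-%d %H:%M:%S",
)

# In the real package this would be something like get_logger(__name__) inside
# langchain_aws/chat_models/bedrock.py, where __name__ resolves to
# "langchain_aws.chat_models.bedrock".
logger = logging.getLogger("langchain_aws.chat_models.bedrock")
logger.info("The input message sent by user: %s", "<messages>")
```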
