Merge branch 'main' into remove-paths-filter
kumaranvpl committed Feb 4, 2025
2 parents 387dc16 + cc74423 commit 6dec31b
Showing 5 changed files with 163 additions and 30 deletions.
2 changes: 2 additions & 0 deletions autogen/agentchat/realtime/experimental/realtime_swarm.py
@@ -107,6 +107,8 @@ def parse_oai_message(message: Union[dict[str, Any], str], role: str, adressee:
class SwarmableAgent:
"""A class for an agent that can participate in a swarm chat."""

__exported_module__ = ""

def __init__(
self,
name: str,
113 changes: 96 additions & 17 deletions notebook/agents_websurfer.ipynb
@@ -29,6 +29,7 @@
" \n",
" ```bash\n",
" playwright install\n",
" playwright install-deps\n",
" ```\n",
"\n",
"3. For running the code in Jupyther, use `nest_asyncio` to allow nested event loops.\n",
@@ -76,16 +77,48 @@
"\n",
"llm_config = {\n",
" \"config_list\": config_list,\n",
"}\n",
"}"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"There are two ways to start a chat session which is using only one agent with LLM configuration.\n",
"\n",
"user_proxy = UserProxyAgent(name=\"user_proxy\", human_input_mode=\"NEVER\")\n",
"#### **Recommended:** Using the `run` Method\n",
"\n",
"The new `run` method simplifies the process by eliminating the need for manual `UserProxyAgent` creation.\n",
"\n",
"- ✅ **Easier setup** – No need to manually register tools"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# The `web_tool=\"browser_use\"` tells the agent to use the `BrowserUseTool` to surf the web.\n",
"websurfer = WebSurferAgent(name=\"WebSurfer\", llm_config=llm_config, web_tool=\"browser_use\")\n",
"\n",
"websurfer_tools = websurfer.tools\n",
"# WebSurferAgent has a list of tools which are registered for LLM\n",
"# We need to register the tools for execution with the UserProxyAgent\n",
"for tool in websurfer_tools:\n",
" tool.register_for_execution(user_proxy)"
"websurfer.run(\n",
" message=\"Get info from https://docs.ag2.ai/docs/Home\",\n",
" tools=websurfer.tools,\n",
" max_turns=2,\n",
" user_input=False,\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### **Manual Setup:** Using `initiate_chat` Method\n",
"This method requires manually creating a `UserProxyAgent` and registering tools for execution.\n",
"\n",
"- ⚠️ **More setup required**\n",
"- ⚠️ **Must manually register tools**"
]
},
{
@@ -94,6 +127,13 @@
"metadata": {},
"outputs": [],
"source": [
"websurfer = WebSurferAgent(name=\"WebSurfer\", llm_config=llm_config, web_tool=\"browser_use\")\n",
"user_proxy = UserProxyAgent(name=\"user_proxy\", human_input_mode=\"NEVER\")\n",
"# WebSurferAgent has a list of tools which are registered for LLM\n",
"# We need to register the tools for execution with the UserProxyAgent\n",
"for tool in websurfer.tools:\n",
" tool.register_for_execution(user_proxy)\n",
"\n",
"user_proxy.initiate_chat(\n",
" recipient=websurfer,\n",
" message=\"Get info from https://docs.ag2.ai/docs/Home\",\n",
@@ -119,6 +159,7 @@
" \n",
" ```bash\n",
" playwright install\n",
" playwright install-deps\n",
" ```\n",
"\n",
"3. For running the code in Jupyther, use `nest_asyncio` to allow nested event loops.\n",
@@ -153,9 +194,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"### Crawl4AI WebSurferAgent\n",
"\n",
"The ONLY difference is the `web_tool` parameter which needs to be set to `crawl4ai`\n"
"### Crawl4AI WebSurferAgent"
]
},
{
@@ -168,16 +207,48 @@
"\n",
"llm_config = {\n",
" \"config_list\": config_list,\n",
"}\n",
"}"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"There are two ways to start a chat session which is using only one agent with LLM configuration.\n",
"\n",
"user_proxy = UserProxyAgent(name=\"user_proxy\", human_input_mode=\"NEVER\")\n",
"#### **Recommended:** Using the `run` Method\n",
"\n",
"The new `run` method simplifies the process by eliminating the need for manual `UserProxyAgent` creation.\n",
"\n",
"- ✅ **Easier setup** – No need to manually register tools\n"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# `web_tool` parameter must be set to `crawl4ai` in order for the `Crawl4AITool` to be used.\n",
"websurfer = WebSurferAgent(name=\"WebSurfer\", llm_config=llm_config, web_tool=\"crawl4ai\")\n",
"\n",
"websurfer_tools = websurfer.tools\n",
"# WebSurferAgent has a list of tools which are registered for LLM\n",
"# We need to register the tools for execution with the UserProxyAgent\n",
"for tool in websurfer_tools:\n",
" tool.register_for_execution(user_proxy)"
"websurfer.run(\n",
" message=\"Get info from https://docs.ag2.ai/docs/Home\",\n",
" tools=websurfer.tools,\n",
" max_turns=2,\n",
" user_input=False,\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"#### **Manual Setup:** Using `initiate_chat` Method\n",
"This method requires manually creating a `UserProxyAgent` and registering tools for execution.\n",
"\n",
"- ⚠️ **More setup required**\n",
"- ⚠️ **Must manually register tools**"
]
},
{
@@ -186,6 +257,14 @@
"metadata": {},
"outputs": [],
"source": [
"user_proxy = UserProxyAgent(name=\"user_proxy\", human_input_mode=\"NEVER\")\n",
"websurfer = WebSurferAgent(name=\"WebSurfer\", llm_config=llm_config, web_tool=\"crawl4ai\")\n",
"\n",
"# WebSurferAgent has a list of tools which are registered for LLM\n",
"# We need to register the tools for execution with the UserProxyAgent\n",
"for tool in websurfer.tools:\n",
" tool.register_for_execution(user_proxy)\n",
"\n",
"user_proxy.initiate_chat(\n",
" recipient=websurfer,\n",
" message=\"Get info from https://docs.ag2.ai/docs/Home\",\n",
2 changes: 0 additions & 2 deletions test/agentchat/test_function_call.py
@@ -252,7 +252,6 @@ def test_update_function(credentials_gpt_4o_mini: Credentials):
summary_method="reflection_with_llm",
)
messages1 = assistant.chat_messages[user_proxy][-1]["content"]
print(messages1)
print("Chat summary and cost", res1.summary, res1.cost)

assistant.update_function_signature("greet_user", is_remove=True)
@@ -262,7 +261,6 @@ def test_update_function(credentials_gpt_4o_mini: Credentials):
summary_method="reflection_with_llm",
)
messages2 = assistant.chat_messages[user_proxy][-1]["content"]
print(messages2)
# The model should know about the function in the context of the conversation
assert "greet_user" in messages1
assert "greet_user" not in messages2
2 changes: 0 additions & 2 deletions test/agentchat/test_tool_calls.py
@@ -167,7 +167,6 @@ def test_update_tool(credentials_gpt_4o: Credentials):
message="What functions do you know about in the context of this conversation? End your response with 'TERMINATE'.",
)
messages1 = assistant.chat_messages[user_proxy][-1]["content"]
print("Message:", messages1)
print("Summary:", res.summary)
assert messages1.replace("TERMINATE", "") == res.summary, (
"Message (removing TERMINATE) and summary should be the same"
@@ -180,7 +179,6 @@ def test_update_tool(credentials_gpt_4o: Credentials):
summary_method="reflection_with_llm",
)
messages2 = assistant.chat_messages[user_proxy][-1]["content"]
print("Message2:", messages2)
# The model should know about the function in the context of the conversation
assert "greet_user" in messages1
assert "greet_user" not in messages2
74 changes: 65 additions & 9 deletions website/_blogs/2025-01-31-WebSurferAgent/index.mdx
@@ -51,6 +51,7 @@ To get started with the [`Browser Use`](https://github.com/browser-use/browser-u

```bash
playwright install
playwright install-deps
```

3. For running the code in Jupyter, use `nest_asyncio` to allow nested event loops.
@@ -74,26 +75,48 @@ nest_asyncio.apply()
```
### Configure WebSurferAgent with Browser Use
```python
config_list = [{"model": "gpt-4o-mini", "api_key": os.environ["OPENAI_API_KEY"]}]
llm_config = {
"config_list": config_list,
}
```
[`WebSurferAgent`](/docs/api-reference/autogen/agents/WebSurferAgent) is the one responsible for browsing the web and retrieving information. The `web_tool="browser_use"` tells the agent to use the [`BrowserUseTool`](/docs/api-reference/autogen/tools/experimental/BrowserUseTool) to surf the web.
After creating the [`WebSurferAgent`](/docs/api-reference/autogen/agents/WebSurferAgent), there are two ways to start the chat session:
#### **Recommended:** Using the [`run`](/docs/api-reference/autogen/ConversableAgent#run) Method
The new [`run`](/docs/api-reference/autogen/ConversableAgent#run) method simplifies the process by eliminating the need for manual [`UserProxyAgent`](/docs/api-reference/autogen/UserProxyAgent) creation.
- ✅ **Easier setup** – No need to manually register tools
```python
websurfer = WebSurferAgent(name="WebSurfer", llm_config=llm_config, web_tool="browser_use")
websurfer.run(
message="Get info from https://docs.ag2.ai/docs/Home",
tools=websurfer.tools,
max_turns=2,
user_input=False,
)
```
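If you want to inspect the outcome programmatically, you can capture the return value of [`run`](/docs/api-reference/autogen/ConversableAgent#run). The sketch below assumes it returns a `ChatResult`-style object with `summary`, `chat_history`, and `cost` attributes, mirroring [`initiate_chat`](/docs/api-reference/autogen/ConversableAgent#initiate-chat); check the API reference for the version you have installed.

```python
# Sketch only: assumes `run` returns a ChatResult-like object
# (summary, chat_history, cost), as `initiate_chat` does.
result = websurfer.run(
    message="Get info from https://docs.ag2.ai/docs/Home",
    tools=websurfer.tools,
    max_turns=2,
    user_input=False,
)

print(result.summary)  # short summary of the chat
for msg in result.chat_history:  # full message history, oldest first
    print(msg["role"], ":", msg.get("content"))
```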
#### **Manual Setup:** Using [`initiate_chat`](/docs/api-reference/autogen/ConversableAgent#initiate-chat) Method
This method requires manually creating a [`UserProxyAgent`](/docs/api-reference/autogen/UserProxyAgent) and registering tools for execution.
- ⚠️ **More setup required**
- ⚠️ **Must manually register tools**
```python
websurfer = WebSurferAgent(name="WebSurfer", llm_config=llm_config, web_tool="browser_use")
user_proxy = UserProxyAgent(name="user_proxy", human_input_mode="NEVER")
# WebSurferAgent has a list of tools which are registered for LLM
# We need to register the tools for execution with the UserProxyAgent
for tool in websurfer.tools:
tool.register_for_execution(user_proxy)
user_proxy.initiate_chat(
@@ -103,6 +126,7 @@ user_proxy.initiate_chat(
)
```
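For reference, [`initiate_chat`](/docs/api-reference/autogen/ConversableAgent#initiate-chat) returns a `ChatResult`, so the manual flow can be inspected in the same way. A minimal sketch (the `max_turns=2` value is an assumption; match whatever settings you pass above):

```python
# Minimal sketch: capture the ChatResult returned by initiate_chat.
result = user_proxy.initiate_chat(
    recipient=websurfer,
    message="Get info from https://docs.ag2.ai/docs/Home",
    max_turns=2,  # assumed; use the same settings as the call above
)

print("Summary:", result.summary)
print("Cost:", result.cost)
```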
### Output
```console
user_proxy (to WebSurfer):
@@ -331,6 +355,7 @@ To integrate [`Crawl4AI`](https://github.com/unclecode/crawl4ai) with AG2, follo

```bash
playwright install
playwright install-deps
```
3. For running the code in Jupyter, use `nest_asyncio` to allow nested event loops.
```bash
@@ -353,15 +378,45 @@ nest_asyncio.apply()
```
### Configure WebSurferAgent with Crawl4AI
```python
config_list = [{"model": "gpt-4o-mini", "api_key": os.environ["OPENAI_API_KEY"]}]
llm_config = {
"config_list": config_list,
}
```
The only difference from the previous example is that the `web_tool` parameter must be set to `crawl4ai` in order for the [`Crawl4AITool`](/docs/api-reference/autogen/tools/experimental/Crawl4AITool) to be used.
As in the previous example, there are two ways to start the chat session:
#### **Recommended:** Using the [`run`](/docs/api-reference/autogen/ConversableAgent#run) Method
The new [`run`](/docs/api-reference/autogen/ConversableAgent#run) method simplifies the process by eliminating the need for manual [`UserProxyAgent`](/docs/api-reference/autogen/UserProxyAgent) creation.
- ✅ **Easier setup** – No need to manually register tools
```python
# `web_tool` parameter must be set to `crawl4ai` in order for the `Crawl4AITool` to be used.
websurfer = WebSurferAgent(name="WebSurfer", llm_config=llm_config, web_tool="crawl4ai")
websurfer.run(
message="Get info from https://docs.ag2.ai/docs/Home",
tools=websurfer.tools,
max_turns=2,
user_input=False,
)
```
#### **Manual Setup:** Using [`initiate_chat`](/docs/api-reference/autogen/ConversableAgent#initiate-chat) Method
This method requires manually creating a [`UserProxyAgent`](/docs/api-reference/autogen/UserProxyAgent) and registering tools for execution.
- ⚠️ **More setup required**
- ⚠️ **Must manually register tools**
```python
user_proxy = UserProxyAgent(name="user_proxy", human_input_mode="NEVER")
websurfer = WebSurferAgent(name="WebSurfer", llm_config=llm_config, web_tool="crawl4ai")
@@ -378,6 +433,7 @@ user_proxy.initiate_chat(
)
```
### Output
```console
user_proxy (to WebSurfer):
