release notes
Published 2/11/2025
In this release we added a new built-in tool by @richard-gyiko for using Model Context Protocol (MCP) servers. MCP is an open protocol that allows agents to tap into an ecosystem of tools, from browsing the file system to managing Git repositories.
Here is an example of using the mcp-server-fetch tool for fetching web content as Markdown.
```python
# pip install mcp-server-fetch autogen-ext[mcp]
import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.tools.mcp import StdioServerParams, mcp_server_tools


async def main() -> None:
    # Get the fetch tool from mcp-server-fetch.
    fetch_mcp_server = StdioServerParams(command="uvx", args=["mcp-server-fetch"])
    tools = await mcp_server_tools(fetch_mcp_server)

    # Create an agent that can use the fetch tool.
    model_client = OpenAIChatCompletionClient(model="gpt-4o")
    agent = AssistantAgent(name="fetcher", model_client=model_client, tools=tools, reflect_on_tool_use=True)  # type: ignore

    # Let the agent fetch the content of a URL and summarize it.
    result = await agent.run(task="Summarize the content of https://en.wikipedia.org/wiki/Seattle")
    print(result.messages[-1].content)


asyncio.run(main())
```
In this release we also introduce a new built-in tool by @EItanya for querying HTTP-based API endpoints. This lets agents call remotely hosted tools over HTTP.
Here is an example of using the httpbin.org API for base64 decoding.
```python
# pip install autogen-ext[http-tool]
import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.messages import TextMessage
from autogen_core import CancellationToken
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.tools.http import HttpTool

# Define a JSON schema for a base64 decode tool.
base64_schema = {
    "type": "object",
    "properties": {
        "value": {"type": "string", "description": "The base64 value to decode"},
    },
    "required": ["value"],
}

# Create an HTTP tool for the httpbin API.
base64_tool = HttpTool(
    name="base64_decode",
    description="base64 decode a value",
    scheme="https",
    host="httpbin.org",
    port=443,
    path="/base64/{value}",
    method="GET",
    json_schema=base64_schema,
)


async def main():
    # Create an assistant with the base64 tool.
    model = OpenAIChatCompletionClient(model="gpt-4")
    assistant = AssistantAgent("base64_assistant", model_client=model, tools=[base64_tool])

    # The assistant can now use the base64 tool to decode the string.
    response = await assistant.on_messages(
        [TextMessage(content="Can you base64 decode the value 'YWJjZGU=', please?", source="user")],
        CancellationToken(),
    )
    print(response.chat_message.content)


asyncio.run(main())
```
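As a quick sanity check, the expected result of decoding 'YWJjZGU=' can be verified locally with Python's standard library; this does not involve the agent or the httpbin API, it only confirms what the tool call should return:

```python
import base64

# Decode the same value the agent is asked about.
decoded = base64.b64decode("YWJjZGU=").decode("ascii")
print(decoded)  # abcde
```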
We introduced several improvements to MagenticOne (M1) and its agents. M1 now works with text-only models that can't read screenshots, and we made prompt changes so it works better with smaller models.
Did you know you can now configure the m1 CLI tool with a YAML configuration file?
In this release we made several improvements so that SelectorGroupChat works well with smaller models such as Llama 13B, and with hosted models that do not support the name field in Chat Completion messages.
Did you know you can use models served through Ollama directly with OpenAIChatCompletionClient? See: https://microsoft.github.io/autogen/stable/user-guide/agentchat-user-guide/tutorial/models.html#ollama
We enhanced our support for Gemini models. You can now use Gemini models without passing in model_info and base_url.
```python
import asyncio

from autogen_core.models import UserMessage
from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(
    model="gemini-1.5-flash-8b",
    # api_key="GEMINI_API_KEY",
)


async def main() -> None:
    response = await model_client.create([UserMessage(content="What is the capital of France?", source="user")])
    print(response)


asyncio.run(main())
```
Interested in integration with FastAPI? We have a new sample: https://github.com/microsoft/autogen/blob/main/python/samples/agentchat_fastapi
Full Changelog: https://github.com/microsoft/autogen/compare/python-v0.4.5...python-v0.4.6