release notes
Published 2/17/2025
This release contains various bug fixes and feature improvements for the Python API.
Related news: our .NET API website is up and running: https://microsoft.github.io/autogen/dotnet/dev/. Our .NET Core API now has dev releases. Check it out!
Starting from v0.4.7, ModelInfo's required fields are enforced, so please include all required fields in model_info when creating model clients. For example:
from autogen_core.models import UserMessage
from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(
    model="llama3.2:latest",
    base_url="http://localhost:11434/v1",
    api_key="placeholder",
    model_info={
        "vision": False,
        "function_calling": True,
        "json_output": False,
        "family": "unknown",
    },
)

response = await model_client.create([UserMessage(content="What is the capital of France?", source="user")])
print(response)
See ModelInfo for more details.
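To illustrate what "required fields are enforced" means in practice, here is a minimal standalone sketch of the kind of check the enforcement implies. The field names are taken from the example above; the validate_model_info helper is hypothetical and not part of the autogen API — the real enforcement happens inside autogen_core when the client is constructed.

```python
# Hypothetical sketch of the required-field check implied by the v0.4.7 change.
REQUIRED_FIELDS = {"vision", "function_calling", "json_output", "family"}

def validate_model_info(model_info: dict) -> None:
    """Raise if any required ModelInfo field is missing (illustrative only)."""
    missing = REQUIRED_FIELDS - model_info.keys()
    if missing:
        raise ValueError(f"model_info missing required fields: {sorted(missing)}")

# A complete model_info passes; an incomplete one raises ValueError.
validate_model_info({
    "vision": False,
    "function_calling": True,
    "json_output": False,
    "family": "unknown",
})
```

The practical takeaway: clients that previously passed a partial model_info will now fail fast at construction time rather than misbehaving later.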
Added strict mode support to BaseTool, ToolSchema and FunctionTool to allow tool calls to be used together with structured output mode, by @ekzhu in https://github.com/microsoft/autogen/pull/5507

Full Changelog: https://github.com/microsoft/autogen/compare/python-v0.4.6...python-v0.4.7
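For context on what strict mode entails: structured output (strict) tool schemas generally require every property to be listed as required and forbid extra keys. The sketch below is a hypothetical illustration of that transformation on a plain JSON-schema dict; it is not the autogen implementation, and the make_strict helper is an assumption for illustration only.

```python
# Hypothetical sketch: what a "strict" tool parameter schema looks like.
# Strict/structured-output mode typically requires all properties to be
# marked required and extra keys to be disallowed.
def make_strict(schema: dict) -> dict:
    strict = dict(schema)
    strict["required"] = sorted(schema.get("properties", {}))
    strict["additionalProperties"] = False
    return strict

loose = {
    "type": "object",
    "properties": {
        "city": {"type": "string"},
        "country": {"type": "string"},
    },
}
print(make_strict(loose))
```

With strict schemas like this, the model's tool-call arguments are guaranteed to match the declared shape, which is what makes tool calls usable alongside structured output mode.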