Agent Daily

[Release] microsoft/autogen: python-v0.6.0

By ekzhu (GitHub)

Microsoft AutoGen v0.6.0 introduces concurrent agent execution in GraphFlow with fan-out-fan-in patterns, callable conditions for edge routing, and a new OpenAIAgent backed by OpenAI's Response API. Key improvements include enhanced MCP support, AssistantAgent workbench capabilities, code executor optimizations, and expanded model client compatibility across Ollama, Anthropic Bedrock, and OpenAI platforms.

Key Points

  • BaseGroupChatManager.select_speaker now returns List[str] | str instead of just str, enabling concurrent agent execution in GraphFlow with fan-out-fan-in patterns
  • GraphFlow supports concurrent agents running in separate coroutines, allowing multiple agents (e.g., B and C) to execute simultaneously after a parent agent (A) completes
  • Callable conditions (e.g., lambda functions) can now be used for GraphFlow edge conditions, as an alternative to keyword substring-based conditions, preventing 'cannot find next agent' errors when keyword matching is unreliable
  • New OpenAIAgent added, backed by OpenAI Response API for enhanced agent capabilities
  • MCP (Model Context Protocol) now supports Streamable HTTP transport for improved integration
  • AssistantAgent improvements include tool_call_summary_msg_format_fct customization and multiple workbenches support
  • Code executors enhanced with auto-delete temporary files option in LocalCommandLineCodeExecutor and improved output handling in Docker Jupyter executor
  • Multiple client improvements: OpenAIChatCompletionClient adds Llama API support and streaming response statistics; OllamaChatCompletionClient adds Qwen3 support; AnthropicBedrockChatCompletionClient supports implicit AWS credentials
  • MagenticOneGroupChat now uses structured output for orchestrator, improving reliability and consistency
  • Timestamps (created_at) added to BaseChatMessage and BaseAgentEvent for better event tracking and debugging
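The fan-out-fan-in execution model described above can be illustrated with plain asyncio. This is a conceptual sketch of the pattern only (stand-in coroutines, not AutoGen's actual scheduler): B and C run concurrently once A completes, and their results are joined before the flow continues.

```python
import asyncio

async def agent(name: str, upstream: str) -> str:
    # Stand-in for a model call; in GraphFlow each agent runs in its own coroutine.
    await asyncio.sleep(0.01)
    return f"{name} processed: {upstream}"

async def fan_out_fan_in(task: str) -> list[str]:
    a_result = await agent("A", task)  # A runs first
    # Fan-out: B and C run concurrently; fan-in: gather joins their results.
    return list(await asyncio.gather(
        agent("B", a_result),
        agent("C", a_result),
    ))

results = asyncio.run(fan_out_fan_in("write a story"))
print(results)
```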


Artifacts (2)

GraphFlow Concurrent Agents Example (Python)
import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import MaxMessageTermination
from autogen_agentchat.teams import DiGraphBuilder, GraphFlow
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main():
    # Initialize agents with OpenAI model clients.
    model_client = OpenAIChatCompletionClient(model="gpt-4.1-nano")
    agent_a = AssistantAgent("A", model_client=model_client, system_message="You are a helpful assistant.")
    agent_b = AssistantAgent("B", model_client=model_client, system_message="Translate input to Chinese.")
    agent_c = AssistantAgent("C", model_client=model_client, system_message="Translate input to Japanese.")

    # Create a directed graph with fan-out flow A -> (B, C).
    builder = DiGraphBuilder()
    builder.add_node(agent_a).add_node(agent_b).add_node(agent_c)
    builder.add_edge(agent_a, agent_b).add_edge(agent_a, agent_c)
    graph = builder.build()

    # Create a GraphFlow team with the directed graph.
    team = GraphFlow(
        participants=[agent_a, agent_b, agent_c],
        graph=graph,
        termination_condition=MaxMessageTermination(5),
    )

    # Run the team and print the events.
    async for event in team.run_stream(task="Write a short story about a cat."):
        print(event)


asyncio.run(main())
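The release also adds callable edge conditions for routing. The following is a plain-Python stand-in for that idea, not the AutoGen API itself: each outgoing edge carries a predicate over the last message, and the flow activates every edge whose predicate returns True, rather than matching keyword substrings.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Edge:
    target: str
    condition: Callable[[str], bool]  # predicate over the last message text

def next_agents(edges: List[Edge], last_message: str) -> List[str]:
    # Activate every outgoing edge whose condition holds for the last message.
    return [e.target for e in edges if e.condition(last_message)]

edges = [
    Edge("B", lambda msg: "translate" in msg.lower()),
    Edge("C", lambda msg: "summarize" in msg.lower()),
]
print(next_agents(edges, "Please translate this"))
```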

BaseGroupChatManager select_speaker Signature Change (Python)
from typing import List, Sequence

from autogen_agentchat.messages import BaseAgentEvent, BaseChatMessage

# Original signature:
async def select_speaker(self, thread: Sequence[BaseAgentEvent | BaseChatMessage]) -> str:
    ...

# New signature: returning a list of names selects multiple speakers to run concurrently.
async def select_speaker(self, thread: Sequence[BaseAgentEvent | BaseChatMessage]) -> List[str] | str:
    ...