Aegis dag #6197

Open · wants to merge 45 commits into base: main
Conversation

abhinav-aegis

Why are these changes needed?

Related issue number

Checks

Helps close #4623 and #5131. I am creating a pull request, as suggested by @lspinheiro, so that changes can be compared and commented on. The code itself is not yet ready for merging.

You can find the different Graph Execution patterns in the test files: https://github.com/abhinav-aegis/autogen/blob/7ddfb088ac5a7da37d5af59dad92d6f216426169/python/packages/autogen-agentchat/tests/test_digraph_group_chat.py#L596

Message filtering is implemented here: https://github.com/abhinav-aegis/autogen/blob/7ddfb088ac5a7da37d5af59dad92d6f216426169/python/packages/autogen-agentchat/src/autogen_agentchat/teams/_group_chat/_chat_agent_container.py#L116
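As a rough mental model of what that filtering does (a hedged sketch only; `filter_messages` and the dict-based message shape below are illustrative, not the actual `_chat_agent_container.py` API): each agent's container can drop incoming messages whose source is not in that agent's allowed set, so an agent only sees the parts of the conversation it is wired to receive.

```python
from typing import Any


def filter_messages(messages: list[dict[str, Any]], allowed_sources: set[str]) -> list[dict[str, Any]]:
    """Keep only messages whose 'source' is in the allowed set."""
    return [m for m in messages if m["source"] in allowed_sources]


history = [
    {"source": "user", "content": "task"},
    {"source": "A", "content": "draft"},
    {"source": "B", "content": "translation"},
]

# Suppose agent C is configured to see only the user's task and B's output.
visible_to_c = filter_messages(history, {"user", "B"})
```

The real implementation works on typed AgentChat message objects rather than dicts, but the per-source allow-list idea is the same.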

Structured message schema deserialization is here: https://github.com/abhinav-aegis/autogen/blob/aegis-dag/python/packages/autogen-agentchat/src/autogen_agentchat/utils/_structured_message_utils.py

See several tests for the deserialization here: https://github.com/abhinav-aegis/autogen/blob/aegis-dag/python/packages/autogen-agentchat/tests/test_structured_message_utils.py
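For readers who have not opened the links, the core idea of schema-based message deserialization can be sketched in plain Python. This is a minimal illustration, not the actual autogen implementation; the `MESSAGE_TYPES` registry and `deserialize_message` helper are hypothetical names, and the real code uses Pydantic models rather than dataclasses.

```python
import json
from dataclasses import dataclass


@dataclass
class TextMessage:
    source: str
    content: str


@dataclass
class StructuredMessage:
    source: str
    content: dict


# Hypothetical registry mapping the serialized "type" field to a message class.
MESSAGE_TYPES = {
    "TextMessage": TextMessage,
    "StructuredMessage": StructuredMessage,
}


def deserialize_message(raw: str):
    """Pick the concrete message class based on the serialized 'type' field."""
    data = json.loads(raw)
    cls = MESSAGE_TYPES[data.pop("type")]
    return cls(**data)


msg = deserialize_message('{"type": "TextMessage", "source": "A", "content": "hola"}')
```

The linked utility generalizes this by deriving the schema for structured content from a Pydantic model instead of a fixed class table.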

See StructuredMessageComponent here: https://github.com/abhinav-aegis/autogen/blob/7ddfb088ac5a7da37d5af59dad92d6f216426169/python/packages/autogen-agentchat/src/autogen_agentchat/messages.py#L214

See serialization tests here: https://github.com/abhinav-aegis/autogen/blob/7ddfb088ac5a7da37d5af59dad92d6f216426169/python/packages/autogen-agentchat/tests/test_messages.py#L56

@victordibia
Collaborator

victordibia commented Apr 4, 2025

Hi @abhinav-aegis ,

Thanks for putting this together, much appreciated. I am personally excited to see this take shape and ideally, it would be great to see it get to a point where we support it in AutoGen Studio.

That said, as we improve it, what is a good minimal testable example we can use to help readers of this thread get a sense of what is being accomplished? It would also provide some framework for the expected behaviour.

Perhaps something like the example below (B and C below seem to not be responding with )?

import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import MaxMessageTermination
from autogen_agentchat.teams._group_chat._digraph_group_chat import DiGraph, DiGraphEdge, DiGraphGroupChat, DiGraphNode
from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(model="gpt-4o-mini")
agent_a = AssistantAgent("A", model_client=model_client, system_message="You are a helpful assistant.")
agent_b = AssistantAgent("B", model_client=model_client, system_message="You are a helpful Spanish translator. Whenever you receive a message, translate it to Spanish and respond with the translation.")
agent_c = AssistantAgent("C", model_client=model_client, system_message="You are a helpful markdown assistant. Whenever you receive a message, format it as markdown (use tables where appropriate) and respond with the formatted message.")

graph = DiGraph(
    nodes={
        "A": DiGraphNode(name="A", edges=[DiGraphEdge(target="B")]),
        "B": DiGraphNode(name="B", edges=[DiGraphEdge(target="C")]),
        "C": DiGraphNode(name="C", edges=[]),
    }
)

team = DiGraphGroupChat(
    participants=[agent_a, agent_b, agent_c],
    graph=graph,
    termination_condition=MaxMessageTermination(5),
)


async def main() -> None:
    stream = team.run_stream(task="Write a 3 line haiku poem about the amount of rainfail each month for california.")
    async for message in stream:
        print("********", message)


asyncio.run(main())

Result

TaskResult(messages=[TextMessage(source='user', models_usage=None, metadata={}, content='Write a 3 line haiku poem about the amount of rainfail each month for california.', type='TextMessage'), TextMessage(source='A', models_usage=RequestUsage(prompt_tokens=37, completion_tokens=21), metadata={}, content="Winter's soft whispers,  \nSpring's vibrant blooms drink deeply,  \nSummer's drought holds sway.", type='TextMessage'), TextMessage(source='B', models_usage=RequestUsage(prompt_tokens=79, completion_tokens=27), metadata={}, content='Susurros de invierno,  \nLas flores vibrantes de primavera beben profundamente,  \nLa sequía del verano prevalece.', type='TextMessage'), TextMessage(source='C', models_usage=RequestUsage(prompt_tokens=118, completion_tokens=64), metadata={}, content="### Haiku about California Rainfall\n\n**English:**\n\nWinter's soft whispers,  \nSpring's vibrant blooms drink deeply,  \nSummer's drought holds sway.\n\n---\n\n**Spanish:**\n\nSusurros de invierno,  \nLas flores vibrantes de primavera beben profundamente,  \nLa sequía del verano prevalece.", type='TextMessage')], stop_reason='The DiGraph chat has finished executing.')
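To make the execution model concrete for readers: a graph of this shape runs agents in topological order along its edges, so A speaks, then B, then C. The sketch below is plain Python with no autogen dependency (the `execution_order` helper and dict-based graph are illustrative, not the PR's API); it computes that order for the A→B→C chain above using Kahn's algorithm.

```python
from collections import deque


def execution_order(graph: dict[str, list[str]]) -> list[str]:
    """Kahn's topological sort: a node runs once all its predecessors have run."""
    indegree = {node: 0 for node in graph}
    for targets in graph.values():
        for target in targets:
            indegree[target] += 1
    ready = deque(node for node, degree in indegree.items() if degree == 0)
    order: list[str] = []
    while ready:
        node = ready.popleft()
        order.append(node)
        for target in graph[node]:
            indegree[target] -= 1
            if indegree[target] == 0:
                ready.append(target)
    if len(order) != len(graph):
        raise ValueError("graph contains a cycle")
    return order


# The chain from the example: A -> B -> C
print(execution_order({"A": ["B"], "B": ["C"], "C": []}))  # ['A', 'B', 'C']
```

The same idea extends to fan-out (A with edges to both B and C) and fan-in patterns; the PR's DiGraph additionally supports conditional edges and cycles with termination conditions, which a plain topological sort does not capture.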

@abhinav-aegis
Author


@victordibia Thanks a lot for your feedback. I will definitely include such an example. I think the decision at Office Hours last week was to maintain this as a community extension. I will get to that later this week, and when I do, I will be sure to include such an example in the documentation.

@ekzhu
Collaborator

ekzhu commented Apr 11, 2025

@abhinav-aegis, @victordibia and I discussed this, and we think it may be good for us to create an experimental module in AgentChat instead.

@abhinav-aegis
Author


I am okay either way, as a community extension or an experimental module. I will respond on Discord since it is easier to have a quick conversation there.


codecov bot commented Apr 15, 2025

Codecov Report

Attention: Patch coverage is 88.88889% with 41 lines in your changes missing coverage. Please review.

Project coverage is 77.53%. Comparing base (71b7429) to head (754a20c).

| Files with missing lines | Patch % | Lines |
|---|---|---|
| ...agentchat/teams/_group_chat/_digraph_group_chat.py | 85.45% | 32 Missing ⚠️ |
| ...togen_agentchat/utils/_structured_message_utils.py | 94.31% | 5 Missing ⚠️ |
| ...utogen-agentchat/src/autogen_agentchat/messages.py | 92.85% | 2 Missing ⚠️ |
| ...chat/teams/_group_chat/_base_group_chat_manager.py | 92.00% | 2 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #6197      +/-   ##
==========================================
+ Coverage   73.02%   77.53%   +4.51%     
==========================================
  Files         306      202     -104     
  Lines       17591    14816    -2775     
  Branches      406        0     -406     
==========================================
- Hits        12845    11487    -1358     
+ Misses       4473     3329    -1144     
+ Partials      273        0     -273     
| Flag | Coverage Δ |
|---|---|
| unittests | 77.53% <88.88%> (+4.51%) ⬆️ |

Flags with carried forward coverage won't be shown.

☔ View full report in Codecov by Sentry.

Development

Successfully merging this pull request may close these issues.

Enable True Graph-Based Execution flow Pattern in AgentChat
3 participants