Shown here is the two-agent chat pattern, one of the many conversation patterns for agentic AI in AutoGen. You can customize the agents' system prompts to suit your needs.
Two-agent chat is the simplest conversation pattern: two agents simply chat with each other.
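As a rough illustration, here is a minimal two-agent chat sketch using AutoGen's ConversableAgent and initiate_chat. The agent names, system prompts, and model configuration are illustrative assumptions, not the exact setup used in this article.

import os
from autogen import ConversableAgent

# Placeholder model configuration; substitute your own model and API key.
llm_config = {"config_list": [{"model": "gpt-4o", "api_key": os.environ.get("OPENAI_API_KEY")}]}

# Both agents are LLM-backed, so the chat runs fully autonomously.
cashier = ConversableAgent(
    name="cashier",
    system_message="You are a friendly burger-stand cashier. Keep replies short.",
    llm_config=llm_config,
    human_input_mode="NEVER",
)
customer = ConversableAgent(
    name="customer",
    system_message="You are a hungry customer ordering a burger.",
    llm_config=llm_config,
    human_input_mode="NEVER",
)

# The customer starts the conversation; the two agents then alternate turns.
chat_result = customer.initiate_chat(
    recipient=cashier,
    message="Hi, I'd like to order a burger.",
    max_turns=2,
)
print(chat_result.summary)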
Besides this pattern, there are other patterns such as:
Sequential chat: A series of chats between two agents, chained by a carryover mechanism that passes the summary of one chat into the next.
Group chat: A single conversation that includes multiple agents.
Nested chat: Packages a workflow into a single agent so it can be reused within larger workflows.
According to AutoGen's documentation, a sequential chat is a sequence of chats between two agents, connected through a carryover mechanism that brings the summary of the previous chat into the context of the next one.
Here's the plan for the agents we will create:
Collect Name Agent - Asks for the customer's name.
Burger Patties Preference Agent - Asks which type of burger patty the customer prefers.
Fun Facts Agent - Shares an interesting fact based on the customer's selected patty.
So, in this sequence, there are interactions between two agents at each step:
Step 1: Collect Name Agent and Customer
Step 2: Burger Patties Preference Agent and Customer
Step 3: Fun Facts Agent and Customer
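Here is a rough sketch of how this three-step sequence could be wired up with AutoGen's initiate_chats, which carries each chat's summary into the next. The agent names, prompts, turn limits, and model configuration are illustrative assumptions rather than this article's exact setup.

import os
from autogen import ConversableAgent, initiate_chats

llm_config = {"config_list": [{"model": "gpt-4o", "api_key": os.environ.get("OPENAI_API_KEY")}]}

collect_name_agent = ConversableAgent(
    name="collect_name_agent",
    system_message="You collect the customer's name in a friendly way.",
    llm_config=llm_config,
    human_input_mode="NEVER",
)
patty_preference_agent = ConversableAgent(
    name="patty_preference_agent",
    system_message="You ask which burger patty the customer prefers: beef, chicken, or plant-based.",
    llm_config=llm_config,
    human_input_mode="NEVER",
)
fun_facts_agent = ConversableAgent(
    name="fun_facts_agent",
    system_message="Using the name and patty choice carried over from earlier chats, share one fun fact about that patty.",
    llm_config=llm_config,
    human_input_mode="NEVER",
)
customer = ConversableAgent(
    name="customer",
    llm_config=False,           # the customer is a human, not an LLM
    human_input_mode="ALWAYS",  # prompt the human for input on every turn
)

# Each chat's summary is carried over into the context of the next chat.
chat_results = initiate_chats([
    {"sender": collect_name_agent, "recipient": customer,
     "message": "Hi! May I have your name, please?",
     "summary_method": "reflection_with_llm", "max_turns": 2},
    {"sender": patty_preference_agent, "recipient": customer,
     "message": "Which patty would you like: beef, chicken, or plant-based?",
     "summary_method": "reflection_with_llm", "max_turns": 2},
    {"sender": fun_facts_agent, "recipient": customer,
     "message": "Here is a fun fact based on your order!",
     "summary_method": "reflection_with_llm", "max_turns": 1},
])

Setting human_input_mode="ALWAYS" on the customer agent is what brings the human into the loop at each step.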
These agents will work together seamlessly to create an engaging and personalized experience for the customer. Below is a diagram that illustrates this agentic process.
Let's look at the outcome. You can refer to the setup in AutoGen's sequential chat documentation or the course. Your outcome may differ from what is shown here, since LLM output is non-deterministic.
{
  "usage_including_cached_inference": {
    "total_cost": 0,
    "gpt-4o-2024-05-13": {
      "cost": 0,
      "prompt_tokens": 50,
      "completion_tokens": 6,
      "total_tokens": 56
    }
  },
  "usage_excluding_cached_inference": {
    "total_cost": 0,
    "gpt-4o-2024-05-13": {
      "cost": 0,
      "prompt_tokens": 50,
      "completion_tokens": 6,
      "total_tokens": 56
    }
  }
}
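For reference, a usage summary like the one above can be gathered across agents after the chats finish. The sketch below assumes pyautogen's gather_usage_summary helper and reuses the agent variables from the earlier sketch; the exact keys in its output can vary between AutoGen versions.

import json
from autogen import gather_usage_summary

# Aggregate token usage and cost across the agents from the sketch above.
usage = gather_usage_summary(
    [collect_name_agent, patty_preference_agent, fun_facts_agent, customer]
)
print(json.dumps(usage, indent=2))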
In the two-agent example, which is fully autonomous, the result is very good. However, the sequential pattern that involves human input presents some challenges for real-world applications.
Clearly, this pattern cannot be used as-is for real-world burger ordering, since human beings are inherently unpredictable and make complex decisions based on many factors. For challenges 1 & 2, we need a strategy that makes the agent validate the customer's response and re-ask for the relevant information.
Can LLMs, with proper prompting and a framework alone, solve all real-world scenarios involving nested dialogues, branching, and looping? Or do we need to combine LLMs with other strategies? Stay tuned for our future exploration.
"XTOPIA helps Malaysian businesses navigate AI adoption —from strategy to execution. Whether you’re just beginning your AI journey or ready to scale with agent-based automation, we provide tailored solutions grounded in technology, trust, and transformation. XTOPIA is owned and developed by XIMNET."