Amazon Bedrock Flows offers an intuitive visual builder and a set of APIs to seamlessly link foundation models (FMs), Amazon Bedrock features, and AWS services to build and automate user-defined generative AI workflows at scale. Amazon Bedrock Agents offers a fully managed solution for creating, deploying, and scaling AI agents on AWS. With Flows, you can provide explicitly stated, user-defined decision logic to execute workflows, and add agents as nodes in a flow, using the FM to dynamically interpret and execute tasks based on contextual reasoning for certain steps in your workflow.
Today, we’re excited to announce a powerful new capability in Flows: multi-turn conversations with agent nodes (in preview). This capability enhances agent nodes to enable dynamic, back-and-forth conversations between users and flows, similar to a natural dialogue, during a flow execution.
With this new capability, when an agent node requires clarification or additional context from the user before it can continue, it can intelligently pause the flow’s execution and request user-specific information. After the user sends the requested information, the flow seamlessly resumes execution with the enriched input, maintaining the executionId of the conversation.
This allows the node to adapt its behavior based on the user’s response, creating a more interactive and context-aware experience. The following sequence diagram shows the steps in the flow.
Multi-turn conversations allow developers to easily create agent workflows that can dynamically adapt and reason. This is especially useful in complex scenarios where a single interaction is not sufficient to fully understand and address the user’s needs.
This post describes how to create multi-turn conversations and explores how this feature can transform your AI applications.
Solution overview
Consider ACME Corp, a fictional large online travel agency that uses Flows to develop an AI-powered vacation itinerary planner. Their current implementation faces several challenges:
- The planner can’t engage in dynamic conversations, so users must provide all trip details upfront instead of being asked follow-up questions.
- The planner struggles to coordinate a complex, multi-step travel planning process that spans flights, accommodations, activities, and transportation across multiple destinations, leading to inefficiencies and a suboptimal customer experience.
- If users change their preferences or introduce new constraints during the planning process, the application cannot dynamically adapt its recommendations.
Let’s see how the new multi-turn conversation capability in Flows addresses these challenges and enables ACME Corp to build a more intelligent, context-aware, and efficient vacation planner that truly improves the travel planning experience for its customers.
The flow offers two distinct interaction paths. For general travel inquiries, users receive instant responses from an LLM. However, when a user wants to search for or book a flight or hotel, they are connected to an agent that guides them through the process, collecting the essential information while maintaining the session until completion. The workflow is shown in the following diagram.
Prerequisites
This example requires:
- An AWS account and an AWS Identity and Access Management (IAM) role with permission to use Amazon Bedrock. For guidance, see Getting started with Amazon Bedrock. Make sure the role includes the permissions for using flows, as described in Prerequisites for Amazon Bedrock Flows, and the permissions for creating an agent, as described in Prerequisites for creating Amazon Bedrock Agents.
- Access to the models used for invocation and evaluation. For guidance, see Manage access to Amazon Bedrock foundation models.
- An Amazon Bedrock agent that automates tasks for the travel agency application by orchestrating interactions between the FM, API calls, and user conversations. The travel agency agent offers four essential booking capabilities: finding available flights, securing flight reservations, finding suitable hotel accommodations, and completing hotel bookings. For an example of how to create a travel agent, see Agents for Amazon Bedrock now support memory retention and code interpretation (preview). Make sure user input is enabled on your agent; this setting allows the agent to gather all the necessary details through natural conversation, even when the initial request is incomplete.
Create multi-turn conversation flows
To create a multi-turn conversation flow, follow these steps:
- On the Amazon Bedrock console, choose Flows under Builder tools in the navigation pane.
- Start creating a new flow called ACME-Corp-trip-planner.
For detailed instructions on creating flows, see Amazon Bedrock Flows now generally available with enhanced security and traceability.
Amazon Bedrock provides different node types for building your prompt flow.
- Select the prompt node to evaluate the input intention. It classifies the intent as categoryLetter=A if the user wants to search for or book a hotel or flight, and categoryLetter=B if the user is asking for destination information. If you’re using Amazon Bedrock Prompt Management, you can select the prompt from there.
This node uses the following messages in its prompt configuration:
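The exact prompt text isn’t prescribed here; the following is an illustrative sketch of a classification message you could use, assuming the prompt variable is named {{input}}:

```
You are a query classifier. Analyze the user query in {{input}} and classify it
into exactly one category:

A - The user wants to search for or book a hotel or flight.
B - The user is asking for destination or general travel information.

Respond with only the classification, in the format: categoryLetter=A or categoryLetter=B
```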
In this example, we chose the Amazon Nova Lite model and set the temperature inference parameter to 0.1 to minimize hallucinations and increase output reliability. You can choose other available Amazon Bedrock models.
- Create a Condition node with the following information and connect it to the Query Classifier node. The condition values for this node are:
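As a sketch, assuming the condition input is named categoryLetter and the condition itself is named "Conditions Booking" (as referenced later in this flow), the condition could look like the following:

```
Name: Conditions Booking
Condition: categoryLetter=="A"
```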
- Create a second prompt node for the LLM-guided conversation. The node’s input is the “If all conditions are false” output of the Condition node. To finish this branch of the flow, add a flow output node and connect the prompt node’s output to it.
In this example, we chose the Amazon Nova Lite model and set the temperature inference parameter to 0.1 to minimize hallucinations and increase output reliability.
- Finally, create an agent node and configure it to use the agent you created earlier. The node’s input is the “Conditions Booking” output of the Condition node. To finish this branch of the flow, add a flow output node and connect the agent node’s output to it.
- Choose Save to save your flow.
Test the flow
You are now ready to test the flow through the Amazon Bedrock console or API. First, we ask for some information about Paris. You can see the flow’s traces in the response, which give you a detailed view of the execution process. These traces help you monitor and debug the response time of each step, track the processing of customer inputs, verify that guardrails are properly applied, and identify bottlenecks in the system. Flow traces offer a comprehensive overview of the entire response generation process, enabling more efficient troubleshooting and performance optimization.
Next, we continue the conversation and ask to book a trip to Paris. As you can see, the multi-turn support in Flows allows the agent node to ask follow-up questions and collect all the information before making the reservation.
We continue the conversation with the agent, providing the required information, and the agent eventually completes the booking. In the traces, you can see that the executionId maintains the session across the multi-turn requests.
After confirmation, the agent successfully completed the user’s request.
Use the Amazon Bedrock Flows API
You can also interact with flows programmatically using the InvokeFlow API, as shown in the following code. During the first invocation, the system automatically generates a unique executionId, which maintains the session for 1 hour. This executionId is essential for subsequent InvokeFlow calls, because it preserves the conversation history and provides the agent with the context it needs to complete actions.
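The following is a minimal sketch of the first call using the AWS SDK for Python (Boto3); the Region, node name, and input text are placeholder assumptions, and FLOW_ID and FLOW_ALIAS_ID must be replaced with your flow’s IDs:

```python
import boto3

# Amazon Bedrock Agent Runtime client (Region is an assumption for this example)
client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")

# First invocation: no executionId is passed, so the service creates a new session
response = client.invoke_flow(
    flowIdentifier="FLOW_ID",             # placeholder: your flow ID
    flowAliasIdentifier="FLOW_ALIAS_ID",  # placeholder: your flow alias ID
    inputs=[
        {
            "content": {"document": "I want to book a flight to Paris"},
            "nodeName": "FlowInputNode",  # assumed name of the flow input node
            "nodeOutputName": "document",
        }
    ],
)

# The executionId identifies the conversation session (kept for 1 hour)
execution_id = response["executionId"]

# Read the streamed events returned by the flow
for event in response["responseStream"]:
    print(event)
```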
When an agent node in the flow decides that it needs more information from the user, the response stream (responseStream) from InvokeFlow includes a FlowMultiTurnInputRequestEvent event object. The event contains the requested information in the content (FlowMultiTurnInputContent) field.
The following is an example FlowMultiTurnInputRequestEvent JSON object:
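The exact payload depends on your flow; this sketch assumes an agent node named Trip_planner and an illustrative message:

```json
{
    "nodeName": "Trip_planner",
    "nodeType": "AgentNode",
    "content": {
        "document": "Certainly! I'd be happy to help you book a flight to Paris. Could you please tell me your departure city and travel dates?"
    }
}
```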
Because the flow can’t continue until it receives more input, the flow also emits a FlowCompletionEvent event. A flow always emits the FlowMultiTurnInputRequestEvent before the FlowCompletionEvent. If the value of completionReason in the FlowCompletionEvent event is INPUT_REQUIRED, the flow needs more information before it can continue.
The following is an example FlowCompletionEvent JSON object:
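A minimal sketch of the completion event in this case:

```json
{
    "completionReason": "INPUT_REQUIRED"
}
```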
You can send the user’s response back to the flow by calling the InvokeFlow API again. Be sure to include the executionId for the conversation.
The following is an example of calling the InvokeFlow API again to provide the additional information requested by the agent node:
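Continuing the earlier Boto3 sketch, the follow-up call reuses the executionId; the agent node name, input name (agentInputText), and reply text are assumptions for illustration:

```python
# Follow-up invocation: reuse the executionId so the flow resumes the same session
response = client.invoke_flow(
    flowIdentifier="FLOW_ID",
    flowAliasIdentifier="FLOW_ALIAS_ID",
    executionId=execution_id,                # executionId returned by the first call
    inputs=[
        {
            "content": {"document": "I want to fly from Madrid, May 15 to May 22"},
            "nodeName": "Trip_planner",        # assumed name of the agent node
            "nodeInputName": "agentInputText", # assumed input name on the agent node
        }
    ],
)

for event in response["responseStream"]:
    print(event)
```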
This interaction continues until the agent has all the information it needs to complete the user’s request. When no more information is needed, the flow emits a FlowOutputEvent event, which contains the final response.
The following is an example FlowOutputEvent JSON object:
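A sketch of the output event, with an illustrative final response:

```json
{
    "nodeName": "FlowOutputNode",
    "nodeType": "FlowOutputNode",
    "content": {
        "document": "Your flight from Madrid to Paris on May 15 has been booked, along with the return on May 22. Your confirmation code is ABC123."
    }
}
```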
The flow also emits a FlowCompletionEvent event. This time, the value of completionReason is SUCCESS.
The following is an example FlowCompletionEvent JSON object:
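A minimal sketch of the completion event when the flow finishes:

```json
{
    "completionReason": "SUCCESS"
}
```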
To get started with multi-turn invocation, use the following sample code. It handles subsequent interactions with the same executionId, maintaining context throughout the conversation. You need to specify your flow’s ID in FLOW_ID and its alias ID in FLOW_ALIAS_ID (for instructions on obtaining these IDs, see View information about flows in Amazon Bedrock).
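The following is a minimal end-to-end sketch in Python (Boto3), not a definitive implementation; the Region, node names, and input names are assumptions that should be adjusted to match your flow:

```python
import boto3

FLOW_ID = "FLOW_ID"              # placeholder: your flow ID
FLOW_ALIAS_ID = "FLOW_ALIAS_ID"  # placeholder: your flow alias ID

# Amazon Bedrock Agent Runtime client (Region is an assumption for this example)
client = boto3.client("bedrock-agent-runtime", region_name="us-east-1")


def invoke_flow(user_input, execution_id=None):
    """Invoke the flow once, reusing execution_id for follow-up turns."""
    if execution_id is None:
        # First turn: send the input to the flow input node
        kwargs = {
            "inputs": [{
                "content": {"document": user_input},
                "nodeName": "FlowInputNode",       # assumed flow input node name
                "nodeOutputName": "document",
            }]
        }
    else:
        # Follow-up turn: route the reply back to the agent node that asked for it
        kwargs = {
            "executionId": execution_id,
            "inputs": [{
                "content": {"document": user_input},
                "nodeName": "Trip_planner",        # assumed agent node name
                "nodeInputName": "agentInputText", # assumed agent node input name
            }]
        }

    response = client.invoke_flow(
        flowIdentifier=FLOW_ID,
        flowAliasIdentifier=FLOW_ALIAS_ID,
        **kwargs,
    )

    execution_id = response["executionId"]
    input_required = False

    # Process the streamed events returned by the flow
    for event in response["responseStream"]:
        if "flowMultiTurnInputRequestEvent" in event:
            # The agent node is asking the user for more information
            print("Agent:", event["flowMultiTurnInputRequestEvent"]["content"]["document"])
            input_required = True
        elif "flowOutputEvent" in event:
            print("Flow output:", event["flowOutputEvent"]["content"]["document"])
        elif "flowCompletionEvent" in event:
            print("Completion reason:", event["flowCompletionEvent"]["completionReason"])

    return execution_id, input_required


# Start the conversation, then keep replying while the flow requests more input
execution_id, needs_input = invoke_flow("I want to book a flight to Paris")
while needs_input:
    execution_id, needs_input = invoke_flow(input("You: "), execution_id)
```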
The system prompts for additional input when needed, using the executionId to maintain context across multiple interactions and provide a coherent, continuous conversation flow while executing the requested actions.
Clean up
To clean up your resources, delete the flow, agent, AWS Lambda function created for the agent, and knowledge base.
Conclusion
The introduction of multi-turn conversation capabilities in Flows represents a major advance in building sophisticated conversational AI applications. In this post, we showed how developers can use this feature to create dynamic, context-aware workflows that handle complex interactions while preserving conversation history and state. The visual builder interface and APIs of Flows, combined with the powerful capabilities of agents, make it straightforward to develop and deploy intelligent applications that can carry on natural, multi-step conversations.
This new capability allows businesses to build more intuitive and responsive AI solutions that better meet customer needs. Whether you’re developing a travel booking system, a customer service application, or another conversational application, multi-turn conversations with Flows provide the tools you need to create sophisticated AI workflows with minimal complexity.
We encourage you to explore this capability on the Amazon Bedrock console and start building your own multi-turn conversational applications today. For more information and detailed documentation, see the Amazon Bedrock User Guide. We look forward to seeing the innovative solutions you create with this powerful new feature.
About the authors
Christian Kamwangala is an AWS AI/ML and Generative AI Specialist Solutions Architect based in Paris, France. He helps enterprise customers design and implement cutting-edge AI solutions using a comprehensive suite of AWS tools, with a focus on production-ready systems that follow industry best practices. In his free time, Christian enjoys exploring nature and spending time with family and friends.
Irene Arroyo Delgado is an AI/ML and GenAI Specialist Solutions Architect at AWS. She focuses on unlocking the potential of generative AI for each use case and automating the end-to-end ML lifecycle, bringing ML workloads into production to achieve the business outcomes customers want. In her free time, Irene enjoys traveling and hiking.