Assessing the application portfolio that needs to be moved to the cloud can be time-consuming. Even with AWS Application Discovery Service and some form of configuration management database (CMDB) in place, customers still face many challenges: follow-up discussions with application teams to review discovery output and understand dependencies (approximately 2 hours per application); producing a cloud architecture design that meets security and compliance requirements; the cycles and effort required to provide a cost estimate; and choosing the right AWS services and configurations to optimize application performance in the cloud. These tasks typically take 6 to 8 weeks before the actual application migration begins.
In this blog post, we use the power of generative AI and Amazon Bedrock to help organizations simplify, accelerate, and scale their migration assessments. Amazon Bedrock is a fully managed service that provides a choice of high-performing foundation models (FMs) from leading AI companies, including AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon, through a single API, along with the capabilities you need to build generative AI applications with security, privacy, and responsible AI. We demonstrate how to use Amazon Bedrock agents, action groups, and Amazon Bedrock knowledge bases to build a migration assistant application that quickly generates migration plans, R-dispositions, and cost estimates for applications you want to migrate to AWS. This approach lets you scale your application portfolio discovery and significantly accelerate the planning phase.
Migration Assistant general requirements
Here are some important requirements to consider when building your migration assistant:
Accuracy and consistency
Can the migration assistant application provide accurate and consistent responses?
Guidance: To ensure accurate and consistent responses from the migration assistant, implement an Amazon Bedrock knowledge base. The knowledge base should contain contextual information based on your company's private data sources. This enables the migration assistant to use Retrieval Augmented Generation (RAG), which improves response accuracy and consistency. The knowledge base can draw on multiple data sources containing your company's private data.
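As a minimal sketch of the RAG pattern described above, the snippet below builds a `RetrieveAndGenerate` request for the Amazon Bedrock `bedrock-agent-runtime` API. The knowledge base ID, model ARN, and question are placeholder assumptions, not values from this post.

```python
import json

def build_rag_request(kb_id, model_arn, question):
    """Build a RetrieveAndGenerate request that grounds the answer
    in the migration knowledge base (the RAG pattern)."""
    return {
        "input": {"text": question},
        "retrieveAndGenerateConfiguration": {
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,
                "modelArn": model_arn,
            },
        },
    }

def ask_migration_assistant(kb_id, model_arn, question):
    """Send the request to Amazon Bedrock (requires AWS credentials)."""
    import boto3  # deferred so the payload builder stays dependency-free
    client = boto3.client("bedrock-agent-runtime")
    response = client.retrieve_and_generate(
        **build_rag_request(kb_id, model_arn, question))
    return response["output"]["text"]

request = build_rag_request(
    "KB123EXAMPLE",  # placeholder knowledge base ID
    "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-sonnet-20240229-v1:0",
    "What are the dependencies of application A1-CRM?",
)
print(json.dumps(request, indent=2))
```

Because the retrieved passages come from your private data sources, answers stay grounded in your own discovery and CMDB content rather than the model's general knowledge.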
Dealing with hallucinations
How do you mitigate large language model (LLM) hallucinations in the migration assistant application?
Guidance: Mitigating LLM hallucinations requires several key strategies. Implement customized prompts based on your requirements and incorporate advanced prompting techniques to guide the model's reasoning and provide examples of the desired responses. These techniques include chain-of-thought prompting, zero-shot, multi-shot, and few-shot prompting, and model-specific prompt engineering guidelines (see Amazon Bedrock prompt engineering guidelines for Anthropic Claude). RAG combines information retrieval with generation, which increases contextual relevance and reduces hallucinations. Finally, a feedback loop or human-in-the-loop review when fine-tuning the LLM on a given dataset helps keep responses accurate and relevant, reducing errors and outdated content.
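As an illustration of the few-shot prompting technique mentioned above, the sketch below assembles a prompt that places worked examples before the real question. The example question-answer pairs are hypothetical, chosen only to show the format.

```python
# Hypothetical worked examples that demonstrate the desired answer style.
FEW_SHOT_EXAMPLES = [
    {
        "question": "What R-disposition fits a legacy app with no vendor support?",
        "answer": "Retire or repurchase; document the decision and its cost impact.",
    },
    {
        "question": "What R-disposition fits a stateless web tier on VMs?",
        "answer": "Rehost first, then consider replatforming to containers.",
    },
]

def build_few_shot_prompt(question, examples=FEW_SHOT_EXAMPLES):
    """Prepend worked examples to steer the model toward the desired
    answer format and reduce hallucinated content."""
    parts = [
        "You are a cloud migration assistant. "
        "Answer concisely and only from the given context.\n"
    ]
    for ex in examples:
        parts.append(f"Question: {ex['question']}\nAnswer: {ex['answer']}\n")
    parts.append(f"Question: {question}\nAnswer:")
    return "\n".join(parts)

print(build_few_shot_prompt(
    "What R-disposition fits a COTS CRM with a SaaS successor?"))
```

The same template can be extended with chain-of-thought cues (for example, "think step by step before answering") or with retrieved context from the knowledge base.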
Modular design
Is the migration assistant modular in design?
Guidance: Building the migration assistant application using the modular design of Amazon Bedrock action groups provides three important benefits.
- Customization and adaptability: Action groups allow users to customize migration workflows for specific AWS environments and requirements. For example, when a user migrates a web application to AWS, the migration workflow can be customized to include specific actions for web server setup, database migration, and network configuration. This customization ensures that the migration process is tailored to the specific needs of the applications being migrated.
- Maintenance and troubleshooting: Simplify maintenance and troubleshooting tasks by isolating problems into individual components. For example, if there is an issue with a database migration action within a migration workflow, it can be addressed independently without affecting other components. This separation streamlines the troubleshooting process and minimizes the impact on the overall migration operation, ensuring smoother migrations and faster issue resolution.
- Scalability and reusability: Facilitates scalability and reusability across different AWS migration projects. For example, if you successfully migrate an application to AWS using a set of modular action groups, you can reuse those same action groups to migrate other applications with similar requirements. This reusability saves time and effort when developing new migration workflows and ensures consistency across multiple migration projects. The modular design also facilitates scalability by letting users scale migration operations up or down based on workload demands. For example, if you need to migrate a large application with higher resource requirements, you can easily scale up the migration workflow by adding instances of the related action groups, without redesigning the entire workflow from scratch.
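One way to realize this modular design is to give each action group its own small OpenAPI schema, so a workflow can be assembled, replaced, or reused one action group at a time. The sketch below shows a hypothetical schema for a migration-plan action group; the path and field names are assumptions, not the actual schema used in this post.

```python
import json

# Hypothetical OpenAPI schema describing one self-contained action group.
MIGRATION_PLAN_SCHEMA = {
    "openapi": "3.0.0",
    "info": {"title": "GenerateMigrationPlan", "version": "1.0.0"},
    "paths": {
        "/migration-plan": {
            "post": {
                "description": "Generate a migration plan for one application",
                "operationId": "generateMigrationPlan",
                "requestBody": {
                    "required": True,
                    "content": {
                        "application/json": {
                            "schema": {
                                "type": "object",
                                "properties": {
                                    "applicationId": {"type": "string"}
                                },
                                "required": ["applicationId"],
                            }
                        }
                    },
                },
                "responses": {
                    "200": {
                        "description": "Generated migration plan document",
                        "content": {
                            "application/json": {"schema": {"type": "object"}}
                        },
                    }
                },
            }
        }
    },
}

# The agent receives this schema as a JSON string when the action group
# is attached, so each group stays an independently swappable unit.
print(json.dumps(MIGRATION_PLAN_SCHEMA["info"]))
```

A second action group (for example, one that produces the R-disposition and cost estimate) would carry its own schema and Lambda function, so either can be maintained or reused without touching the other.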
Solution overview
Before getting into the implementation details, let’s take a look at the major steps in the architecture that will be established, as shown in Figure 1.
- Users interact with the migration assistant through the Amazon Bedrock chat console and enter their requests. For example, a user may ask the assistant to generate an R-disposition with cost estimates or generate a migration plan for a specific application ID (for example, A1-CRM or A2-CMDB).
- The migration assistant, built on Amazon Bedrock agents, consists of instructions, action groups, and a knowledge base. When processing a user's request, it invokes the relevant action groups, such as R-disposition and Migration plan, which in turn call specific AWS Lambda functions.
- The Lambda functions use RAG to process the request and produce the required output.
- The resulting output documents (R-disposition with cost estimates and Migration plan) are uploaded to a specified Amazon Simple Storage Service (Amazon S3) bucket.
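The Lambda side of these steps can be sketched as an Amazon Bedrock agent action-group handler. The event and response shapes follow the agent's Lambda contract; the plan content, bucket name, and helper function are placeholder assumptions standing in for the real RAG-backed generation.

```python
import json

def generate_plan(application_id):
    """Placeholder for the real RAG-backed plan generation logic."""
    return {
        "applicationId": application_id,
        "disposition": "Rehost",
        "steps": ["Discover dependencies", "Provision target VPC",
                  "Migrate and validate"],
    }

def lambda_handler(event, context):
    # Amazon Bedrock agents pass the invoked action group, API path,
    # and any parameters extracted from the user's request.
    params = {p["name"]: p["value"] for p in event.get("parameters", [])}
    plan = generate_plan(params.get("applicationId", "unknown"))

    # Optionally persist the document to Amazon S3 (requires boto3 and
    # credentials; bucket name is a placeholder):
    # boto3.client("s3").put_object(
    #     Bucket="my-migration-output",
    #     Key=f"plans/{plan['applicationId']}.json",
    #     Body=json.dumps(plan))

    # Response shape expected by Bedrock agents for API-schema action groups.
    return {
        "messageVersion": "1.0",
        "response": {
            "actionGroup": event.get("actionGroup"),
            "apiPath": event.get("apiPath"),
            "httpMethod": event.get("httpMethod"),
            "httpStatusCode": 200,
            "responseBody": {"application/json": {"body": json.dumps(plan)}},
        },
    }

sample_event = {
    "actionGroup": "MigrationPlan",
    "apiPath": "/migration-plan",
    "httpMethod": "POST",
    "parameters": [{"name": "applicationId", "value": "A1-CRM"}],
}
print(lambda_handler(sample_event, None)["response"]["httpStatusCode"])
```

The agent reads the `responseBody` JSON and weaves it into its reply to the user, while the uploaded S3 object serves as the durable output document.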
The following image is a screenshot of a sample user interaction with the migration assistant.
Prerequisites
You will need:
Installation steps
- Configure your knowledge base.
- Open the AWS Management Console for Amazon Bedrock and navigate to Knowledge bases.
- Choose Create knowledge base and enter a name and optional description.
- Select a vector database (such as Amazon OpenSearch Serverless).
- Select an embedding model (for example, Amazon Titan Embeddings G1 – Text).
- Add a data source.
- For Amazon S3: Specify the S3 bucket and prefix, file type, and chunking configuration.
- For custom data: Ingest data programmatically using the API.
- Review and create a knowledge base.
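The console steps above can also be scripted. As a sketch, the snippet below assembles the request for boto3's `bedrock-agent` `create_knowledge_base` call with an Amazon OpenSearch Serverless vector store; the role ARN, collection ARN, index name, and field mapping are placeholder assumptions.

```python
def build_kb_request(name, role_arn, embedding_model_arn,
                     collection_arn, index_name):
    """Assemble a CreateKnowledgeBase request mirroring the console
    steps above (vector store + embedding model + data access role)."""
    return {
        "name": name,
        "roleArn": role_arn,
        "knowledgeBaseConfiguration": {
            "type": "VECTOR",
            "vectorKnowledgeBaseConfiguration": {
                "embeddingModelArn": embedding_model_arn,
            },
        },
        "storageConfiguration": {
            "type": "OPENSEARCH_SERVERLESS",
            "opensearchServerlessConfiguration": {
                "collectionArn": collection_arn,
                "vectorIndexName": index_name,
                "fieldMapping": {  # placeholder field names
                    "vectorField": "embedding",
                    "textField": "text",
                    "metadataField": "metadata",
                },
            },
        },
    }

def create_kb(**kwargs):
    """Issue the real API call (requires AWS credentials)."""
    import boto3  # deferred: only needed for the live call
    client = boto3.client("bedrock-agent")
    return client.create_knowledge_base(**build_kb_request(**kwargs))
```

After creation, data sources are added with `create_data_source` and ingested with `start_ingestion_job`, matching the S3 and custom-data options listed above.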
- Set up the Amazon Bedrock agent.
- In the Amazon Bedrock console, select the Agents section and choose Create agent.
- Enter a name and optional description for the agent.
- Select a base model (for example, Anthropic Claude 3).
- Configure the agent’s AWS Identity and Access Management (IAM) role to grant the necessary permissions.
- Add instructions to guide the agent’s behavior.
- Optionally, add the Amazon Bedrock knowledge base you previously created to enhance the agent’s response.
- Configure additional settings such as maximum tokens and temperature.
- Review and create an agent.
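Equivalently, the agent can be created with boto3's `bedrock-agent` `create_agent` call. The sketch below mirrors the console steps; the role ARN, model ID, and instruction text are placeholder assumptions.

```python
def build_agent_request(name, role_arn, model_id, instruction):
    """Assemble a CreateAgent request mirroring the console steps above."""
    return {
        "agentName": name,
        "agentResourceRoleArn": role_arn,   # IAM role granting permissions
        "foundationModel": model_id,
        "instruction": instruction,          # guides the agent's behavior
        "idleSessionTTLInSeconds": 600,
    }

request = build_agent_request(
    "migration-assistant",
    "arn:aws:iam::123456789012:role/BedrockAgentRole",  # placeholder role
    "anthropic.claude-3-sonnet-20240229-v1:0",          # example model ID
    "You are a migration assistant. Generate migration plans, "
    "R-dispositions, and cost estimates for the requested application.",
)
print(request["agentName"])
```

A knowledge base created earlier can then be attached with `associate_agent_knowledge_base`, matching the optional console step above.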
- Configure agent action groups.
- On the agent configuration page, choose Action groups.
- Choose Add action group for each required group (for example, Create R-disposition assessment or Create migration plan).
- For each action group, specify the Lambda function that executes its actions and the API schema that describes them.
- After adding all action groups, review the entire agent configuration and deploy the agent.
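Each action group added in the console can likewise be attached with boto3's `create_agent_action_group`. In this sketch, the agent ID, Lambda ARN, and schema contents are placeholder assumptions.

```python
import json

def build_action_group_request(agent_id, name, lambda_arn, api_schema):
    """Assemble a CreateAgentActionGroup request binding one action
    group to the Lambda function that implements it."""
    return {
        "agentId": agent_id,
        "agentVersion": "DRAFT",  # action groups attach to the draft version
        "actionGroupName": name,
        "actionGroupExecutor": {"lambda": lambda_arn},
        "apiSchema": {"payload": json.dumps(api_schema)},
    }

request = build_action_group_request(
    "AGENT123EXAMPLE",  # placeholder agent ID
    "GenerateMigrationPlan",
    "arn:aws:lambda:us-east-1:123456789012:function:migration-plan",  # placeholder
    {"openapi": "3.0.0",
     "info": {"title": "GenerateMigrationPlan", "version": "1.0.0"},
     "paths": {}},  # abbreviated schema for illustration
)
print(request["actionGroupName"])
```

Repeating this call once per action group (for example, once for the R-disposition assessment and once for the migration plan) keeps each group an independent, reusable unit before the agent is prepared and deployed.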
Cleanup
To avoid unnecessary charges, delete the resources created during testing. Use the following steps to clean up your resources.
- Delete the Amazon Bedrock knowledge base. Open the Amazon Bedrock console.
- Remove the knowledge base from its associated agents first. In the left navigation pane, choose Agents.
- Select the name of the agent whose knowledge base you want to remove.
- A red banner appears, warning you to remove references to the knowledge base from the agent once it no longer exists.
- Select the radio button next to the knowledge base you want to remove, choose More, and then select Delete.
- In the left navigation pane, choose Knowledge bases.
- To delete a data source, select the radio button next to the source and choose Delete; to delete the knowledge base itself, select the radio button next to its name and choose Delete in the top-right corner of the details page.
- Review the warnings about deleting the knowledge base. If you accept these conditions, type delete in the input box and choose Delete to confirm.
- Delete the agent.
- In the Amazon Bedrock console, choose Agents from the left navigation pane.
- Select the radio button next to the agent you want to delete.
- A modal appears warning you about the consequences of deletion. Type delete in the input box and choose Delete to confirm.
- A blue banner appears informing you that the agent is being deleted. When deletion is complete, a green success banner appears.
- Delete all other resources, including the Lambda functions and any other AWS resources created in your account for this solution.
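The cleanup steps above can be scripted with boto3's `bedrock-agent` client. Deletion order matters: the knowledge base must be disassociated from the agent before it is deleted. The IDs below are placeholders, and the function only issues real API calls when run with AWS credentials.

```python
# Dependency order for tearing down the demo resources.
CLEANUP_ORDER = [
    "disassociate knowledge base from agents",
    "delete data sources",
    "delete knowledge base",
    "delete agent",
    "delete Lambda functions and remaining resources",
]

def cleanup(agent_id, kb_id, data_source_ids):
    """Delete the demo resources in dependency order (requires credentials)."""
    import boto3  # deferred so the ordering constant stays importable offline
    client = boto3.client("bedrock-agent")
    # 1. Remove the knowledge base reference from the agent first.
    client.disassociate_agent_knowledge_base(
        agentId=agent_id, agentVersion="DRAFT", knowledgeBaseId=kb_id)
    # 2. Delete data sources, then the knowledge base itself.
    for ds_id in data_source_ids:
        client.delete_data_source(knowledgeBaseId=kb_id, dataSourceId=ds_id)
    client.delete_knowledge_base(knowledgeBaseId=kb_id)
    # 3. Delete the agent.
    client.delete_agent(agentId=agent_id)
    # 4. Lambda functions and other resources are deleted separately
    #    (for example, with the "lambda" client's delete_function).
```

Running this against the placeholder IDs of your test deployment removes the billable Bedrock resources created in this walkthrough.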
Conclusion
Performing an application portfolio assessment for AWS Cloud migration can be a time-consuming process. It involves analyzing data from various sources, holding discovery and design discussions to develop an AWS Cloud architecture, and estimating costs.
In this blog post, you learned how to use generative AI and Amazon Bedrock to simplify, accelerate, and scale migration assessments. We showed you how to use Amazon Bedrock agents, action groups, and Amazon Bedrock knowledge bases in a migration assistant application that generates migration plans, R-dispositions, and cost estimates. This approach significantly reduces the time and effort required for portfolio assessment, enabling organizations to scale and accelerate their migration to the AWS Cloud.
Ready to improve your cloud migration process with generative AI on Amazon Bedrock? Get started with the Amazon Bedrock User Guide to understand how you can streamline your organization's migration to the cloud. For additional support and expertise, consider engaging AWS Professional Services (contact your sales representative) to streamline your cloud migration journey and take full advantage of Amazon Bedrock.
About the author
Ebbey Thomas is a Senior Cloud Architect at AWS, focused on leveraging generative AI to enhance cloud infrastructure automation and accelerate migrations. In his role with AWS Professional Services, Ebbey designs and implements solutions that improve the speed and efficiency of cloud adoption while ensuring secure and scalable operations for AWS users. He is known for solving complex cloud challenges and delivering tangible results for his clients. Ebbey holds a bachelor's degree in computer engineering and a master's degree in information systems from Syracuse University.
Shiva Vaidyanathan is a Principal Cloud Architect at AWS. He provides technical guidance, design, and implementation project leadership to customers to ensure their success on AWS. He is committed to making cloud networking easy for everyone. Prior to joining AWS, he worked on several NSF-funded research initiatives on performing secure computing on public cloud infrastructure. He holds a master's degree in computer science from Rutgers University and a master's degree in electrical engineering from New York University.