In an enterprise environment, an organization may divide AI operations into two specialized teams: an AI research team and a model hosting team. The research team is dedicated to developing and enhancing AI models using model training and fine-tuning techniques. Meanwhile, the hosting team is responsible for deploying these models across its own development, staging, and production environments.
Amazon Bedrock Custom Model Import allows hosting teams to import and serve custom models using supported architectures such as Meta Llama 2, Llama 3, and Mistral. Teams can import models with weights in the Hugging Face safetensors format from Amazon SageMaker or Amazon Simple Storage Service (Amazon S3). These imported custom models work alongside existing Amazon Bedrock foundation models (FMs) through a single, unified API in a serverless manner, alleviating the need to manage model deployment and scaling.
However, in such enterprise environments, these teams often work in separate AWS accounts for security and operational reasons. The model development team's training results, called model artifacts (for example, model weights), are typically stored in an S3 bucket within the research team's AWS account, yet the hosting team must access these artifacts from another account to deploy the model. This creates a challenge: how can model artifacts be shared securely across accounts?
This is where cross-account access becomes important. With Amazon Bedrock Custom Model Import cross-account support, you can configure direct access between the S3 bucket holding the model artifacts and the hosting account. This streamlines operational workflows while maintaining security perimeters between teams. One of our customers shared:
Cross-account support for Amazon Bedrock Custom Model Import helped our AI platform team simplify configuration, reduce operational overhead, and keep models secured in their original locations.
– Scott Chang, Principal Engineer, Salesforce AI Platform
This guide provides step-by-step instructions for configuring cross-account access for Amazon Bedrock Custom Model Import, covering both unencrypted and AWS Key Management Service (AWS KMS)-based encryption scenarios.
Example scenario
For this walkthrough, consider two AWS accounts.
- Model Development Account (111122223333):
  - Stores model artifacts (custom weights and configurations) in an S3 bucket named model-artifacts-111122223333
  - Optionally encrypts artifacts using an AWS KMS customer managed key, kms-cmk-111122223333
- Model Hosting Account (777788889999):
  - Hosts models using Amazon Bedrock Custom Model Import
  - Uses a new AWS Identity and Access Management (IAM) execution role, BedrockCMIExecutionRole-777788889999
  - Optionally encrypts artifacts using an AWS KMS key, kms-cmk-777788889999
The following diagram illustrates this setup and shows how cross-account access is configured between the S3 bucket, KMS key, and Amazon Bedrock Custom Model imports.
To implement the scenario described while adhering to the principle of least privilege, the following steps must be performed:
- The model development account must grant the model hosting account's IAM role BedrockCMIExecutionRole-777788889999 access to the S3 bucket through a resource-based policy, and to the encryption key, if applicable.
- The model hosting account must create the IAM role BedrockCMIExecutionRole-777788889999 with the identity-based policies required to access the model development account's S3 bucket and customer managed key kms-cmk-111122223333.
- The model hosting account must allow the Amazon Bedrock service to assume the IAM role BedrockCMIExecutionRole-777788889999 created in the previous step by including the Amazon Bedrock service as a trusted entity. This IAM role is used by the model hosting account to initiate the custom model import job.
Prerequisites
The following prerequisites must be met before you can start a custom model import job:
- If you are importing your model from an S3 bucket, prepare the model files in the Hugging Face weights format. See Import source for more information.
- (Optional) Set additional security configurations.
Step-by-step execution
The following sections present the step-by-step execution of the high-level process outlined previously, from the perspective of an administrator managing both accounts.
Step 1: Set up the S3 bucket policy (model development account) to allow access to the model hosting account's IAM role
- Sign in to the AWS Management Console for account 111122223333 and open the Amazon S3 console.
- In the General purpose buckets view, locate model-artifacts-111122223333, the bucket used by the model development team to store model artifacts.
- On the Permissions tab, choose Edit under Bucket policy and insert the following IAM resource-based policy. Be sure to replace the AWS account IDs with your own information.
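The policy JSON did not survive extraction; a minimal sketch of such a resource-based bucket policy, using the example account IDs and bucket name from this scenario, could look like the following (statement IDs are illustrative):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCMIExecutionRoleListBucket",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::777788889999:role/BedrockCMIExecutionRole-777788889999"
      },
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::model-artifacts-111122223333"
    },
    {
      "Sid": "AllowCMIExecutionRoleGetObject",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::777788889999:role/BedrockCMIExecutionRole-777788889999"
      },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::model-artifacts-111122223333/*"
    }
  ]
}
```

The two statements cover the two object operations the import job needs: listing the bucket contents and downloading individual artifact objects.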
Step 2: Establish an IAM role (model hosting account) and allow Amazon Bedrock to assume this role
- Sign in to the AWS Management Console for account 777788889999 and open the IAM console.
- In the left navigation pane, choose Policies and then choose Create policy. In the Policy editor, switch to the JSON tab and insert the following identity-based policy. This policy grants read-only access, allowing the role to list and download objects from the specified S3 bucket, but only if the bucket is owned by account 111122223333. Customize the AWS account ID and S3 bucket name/prefix with your own information.
- Choose Next, assign the policy name BedrockCMIExecutionPolicy-777788889999, and finalize by choosing Create policy.
- In the left navigation pane, choose Roles and select Custom trust policy as the trusted entity type. Insert the following trust policy, which restricts role assumption to the Amazon Bedrock service, and specifically to model import jobs in account 777788889999 in the US East (N. Virginia) us-east-1 Region. Change the AWS account ID and Region to your own information.
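The trust policy was not preserved; a sketch that restricts assumption to the Amazon Bedrock service for model import jobs in this account and Region, using the standard aws:SourceAccount and aws:SourceArn condition keys, could be:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowBedrockModelImportJobs",
      "Effect": "Allow",
      "Principal": {
        "Service": "bedrock.amazonaws.com"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "aws:SourceAccount": "777788889999"
        },
        "ArnEquals": {
          "aws:SourceArn": "arn:aws:bedrock:us-east-1:777788889999:model-import-job/*"
        }
      }
    }
  ]
}
```

Scoping the SourceArn to model-import-job resources prevents other Amazon Bedrock features from assuming this role (the confused-deputy mitigation).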
- Choose Next, and under Add permissions, search for the policy created in the previous step, BedrockCMIExecutionPolicy-777788889999, select its checkbox, and continue by choosing Next.
- Assign the role name BedrockCMIExecutionRole-777788889999, provide the description "IAM execution role to be used by CMI jobs," and finalize by choosing Create role.
Important: If AWS KMS encryption keys are used for model artifacts in the model development account, or if you want the imported model artifacts encrypted in the Amazon Bedrock managed AWS account, continue with Steps 3 through 5. Otherwise, skip to Step 6.
Step 3: Adjust the AWS KMS key policy (model development account) to allow the Amazon Bedrock CMI execution IAM role to decrypt model artifacts
- Return to the model development account and locate the AWS KMS key named kms-cmk-111122223333 in the AWS KMS console. Note the AWS KMS key Amazon Resource Name (ARN).
- On the Key policy tab, switch to Policy view and add the following resource-based policy statement, which allows the model hosting account's IAM role BedrockCMIExecutionRole-777788889999 to decrypt model artifacts. Replace the highlighted items with your own information.
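The key policy statement is missing from the extracted text; a sketch of a statement granting the hosting account's execution role decrypt access on this key (the action list is an assumption based on what S3 decryption requires) might be:

```json
{
  "Sid": "AllowCMIExecutionRoleDecrypt",
  "Effect": "Allow",
  "Principal": {
    "AWS": "arn:aws:iam::777788889999:role/BedrockCMIExecutionRole-777788889999"
  },
  "Action": [
    "kms:Decrypt",
    "kms:DescribeKey"
  ],
  "Resource": "*"
}
```

In a KMS key policy, "Resource": "*" refers to the key the policy is attached to, not to all keys.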
Step 4: Set up the AWS KMS key policy (model hosting account) to allow the CMI execution IAM role to encrypt and decrypt model artifacts, so they are stored securely in the Amazon Bedrock managed AWS account
- Return to the model hosting account and locate the AWS KMS key named kms-cmk-777788889999 in the AWS KMS console. Note the AWS KMS key ARN.
- Insert the following statement into the AWS KMS key's resource-based policy to allow the BedrockCMIExecutionRole-777788889999 IAM role to encrypt and decrypt model artifacts at rest in the Amazon Bedrock managed AWS account. Replace the highlighted items with your own information.
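The statement itself did not survive extraction; a sketch covering both encrypt and decrypt operations (the exact action list, including kms:CreateGrant for service-managed encryption, is an assumption) could look like:

```json
{
  "Sid": "AllowCMIExecutionRoleEncryptDecrypt",
  "Effect": "Allow",
  "Principal": {
    "AWS": "arn:aws:iam::777788889999:role/BedrockCMIExecutionRole-777788889999"
  },
  "Action": [
    "kms:Decrypt",
    "kms:GenerateDataKey",
    "kms:DescribeKey",
    "kms:CreateGrant"
  ],
  "Resource": "*"
}
```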
Step 5: Update the permissions (model hosting account) of the CMI execution role to provide access to the encryption keys
Go to the IAM console and find the IAM policy BedrockCMIExecutionPolicy-777788889999. Add the following statement to the existing identity-based policy, replacing the ARNs with the key ARNs noted in Steps 3 and 4:
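The statement to add is absent from the extracted text; a sketch granting the execution role access to both customer managed keys (the key IDs shown are placeholders for the ARNs noted in Steps 3 and 4) might be:

```json
{
  "Sid": "AllowKMSAccessForModelArtifacts",
  "Effect": "Allow",
  "Action": [
    "kms:Decrypt",
    "kms:GenerateDataKey",
    "kms:DescribeKey"
  ],
  "Resource": [
    "arn:aws:kms:us-east-1:111122223333:key/<key-id-of-kms-cmk-111122223333>",
    "arn:aws:kms:us-east-1:777788889999:key/<key-id-of-kms-cmk-777788889999>"
  ]
}
```

Because KMS access requires both a key policy allow (Steps 3 and 4) and an identity policy allow for cross-account use, this statement completes the pairing.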
Step 6: Start the model import job (model hosting account)
In this step, you run the model import job using the AWS Command Line Interface (AWS CLI); you can also use AWS SDKs or the API for the same purpose. From a terminal session, run the following command using an IAM user or role that has the permissions required to create a custom model import job. Note that you don't need to explicitly provide the ARN of the CMK used by the model development team.
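The command itself is missing from the extracted text; a sketch of the create-model-import-job call, with job name, model name, and S3 prefix as illustrative placeholders, might look like:

```shell
# Illustrative sketch: job name, model name, and S3 prefix are placeholders.
aws bedrock create-model-import-job \
  --region us-east-1 \
  --job-name "cmi-cross-account-job" \
  --imported-model-name "my-custom-model" \
  --role-arn "arn:aws:iam::777788889999:role/BedrockCMIExecutionRole-777788889999" \
  --model-data-source '{"s3DataSource": {"s3Uri": "s3://model-artifacts-111122223333/<model-prefix>/"}}'
```

The S3 URI points at the model development account's bucket; the cross-account access configured in Steps 1 through 5 is what makes this work without any key ARN from the other account.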
To encrypt the imported model artifacts with Amazon Bedrock Custom Model Import, specify the ARN of the model hosting account's CMK with the --imported-model-kms-key-id flag.
Cross-account access to S3 buckets for custom model import jobs is supported only through the AWS CLI, AWS SDKs, or the API. Console support is not yet available.
Troubleshooting
If misconfigured IAM policies prevent the custom model import job from running, you might encounter an error like the following:
To resolve this, manually validate from the model hosting account that BedrockCMIExecutionRole-777788889999 can access the model development S3 bucket. Follow these steps:
Step 1: Identify the current IAM role or user with the following AWS CLI command, and copy the ARN from the output:
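The command referenced here is presumably the standard STS caller-identity check:

```shell
# Returns the account ID, user ID, and ARN of the caller's current credentials
aws sts get-caller-identity
```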
Step 2: Update the trust policy of BedrockCMIExecutionRole-777788889999 to allow the current user or IAM role to assume this role:
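The exact statement is not preserved; a sketch of an additional trust-policy statement for this purpose, with the principal ARN as a placeholder for the value copied in Step 1, could be:

```json
{
  "Sid": "AllowManualValidation",
  "Effect": "Allow",
  "Principal": {
    "AWS": "<ARN copied from get-caller-identity>"
  },
  "Action": "sts:AssumeRole"
}
```

Remember to remove this temporary statement after troubleshooting is complete.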
Step 3: List or copy the contents of the S3 bucket while assuming the Amazon Bedrock Custom Model Import execution role
- Assume the CMI execution role (replace the ARN with your information).
- Export the returned temporary credentials as environment variables.
- Run the S3 commands to troubleshoot the permission issues.
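The commands for these steps were lost in extraction; a sketch of the sequence, with credential values and the model prefix as placeholders, might look like:

```shell
# 1. Assume the CMI execution role (replace the ARN with your information)
aws sts assume-role \
  --role-arn "arn:aws:iam::777788889999:role/BedrockCMIExecutionRole-777788889999" \
  --role-session-name "cmi-troubleshooting"

# 2. Export the temporary credentials returned by the previous command
export AWS_ACCESS_KEY_ID="<AccessKeyId from output>"
export AWS_SECRET_ACCESS_KEY="<SecretAccessKey from output>"
export AWS_SESSION_TOKEN="<SessionToken from output>"

# 3. Verify the role can list and read the model development bucket
aws s3 ls s3://model-artifacts-111122223333/
```

If the listing fails here, the problem is in the bucket policy (Step 1) or the identity-based policy (Step 2) rather than in the import job itself.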
If the error persists, consider using Amazon Q Developer or refer to the additional troubleshooting resources in the IAM User Guide.
Cleanup
There is no additional charge to import a custom model into Amazon Bedrock (see Step 6 in the Step-by-step execution section). However, if the model is not being used for inference and you want to avoid storage costs (see Amazon Bedrock Pricing), delete the imported model using the AWS console, the AWS CLI, or the API. For example, replace the placeholder text with your imported model name:
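The example command did not survive extraction; a sketch of the deletion via the AWS CLI, with the model name as a placeholder, could be:

```shell
# Deletes the imported model to stop incurring storage costs
aws bedrock delete-imported-model \
  --region us-east-1 \
  --model-identifier "my-custom-model"
```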
Conclusion
By using cross-account access with Amazon Bedrock Custom Model Import, organizations can significantly streamline their AI model deployment workflows.
Amazon Bedrock Custom Model Import is generally available in the US East (N. Virginia) us-east-1 and US West (Oregon) us-west-2 AWS Regions. Refer to the full Region list for future updates. For more information, see the Amazon Bedrock Custom Model Import product page and the Amazon Bedrock Pricing page. Try Amazon Bedrock Custom Model Import in the Amazon Bedrock console today, and send feedback to AWS re:Post for Amazon Bedrock or through your usual AWS support contacts.
We would like to thank contributors Scott Chang (Salesforce), Raghav Tanaji (Salesforce), Rupinder Grewal (AWS), Ishan Singh (AWS), and Dharinee Gupta (AWS).
About the authors
Hrushikesh Gangur is a Principal Solutions Architect at AWS. Based in San Francisco, California, Hrushikesh specializes in machine learning on AWS. As a thought leader in the field of generative AI, Hrushikesh helps AWS startups and ISVs build and deploy AI applications. His expertise spans a wide range of AWS services, including Amazon SageMaker, Amazon Bedrock, and accelerated computing, which are important for building AI applications.
Sai Daraha Acneni is a Software Development Engineer at AWS. He received a Master's degree in Computer Engineering from Cornell University, where he worked in the Autonomous Systems Lab, specializing in computer vision and robot perception. He now helps deploy large language models to optimize throughput and latency.
Prashant Patel is a Senior Software Development Engineer at AWS. He is passionate about scaling large language models for enterprise applications. Prior to joining AWS, he worked at IBM productionizing large-scale AI/ML workloads on Kubernetes. Prashant holds a Master's degree from NYU Tandon School of Engineering. When not at work, he enjoys traveling and playing with his dogs.