Overview of Amazon Bedrock with networking, security and observability considerations

  • 24 January 2024

Amazon Bedrock is a managed AWS service that simplifies consumption of generative AI by making base models from Amazon and other third-party providers available via API. The account and infrastructure for Amazon Bedrock are specific to each model provider and are hosted in escrow accounts owned and managed by AWS.
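
As a quick illustration of that API surface, here is a minimal sketch (assuming default AWS credentials and a Region where Bedrock is available, such as us-east-1) that lists the base models exposed by the service:

```python
import boto3

# Bedrock control-plane client (region is an assumption for this sketch)
bedrock = boto3.client("bedrock", region_name="us-east-1")

# List the foundation models made available by Amazon and third-party providers
for model in bedrock.list_foundation_models()["modelSummaries"]:
    print(model["providerName"], model["modelId"])
```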

 

Amazon Bedrock Backend Architecture

Amazon Bedrock offers foundation models with single- or multi-tenancy options, where customers can:
1- Use the base model in a multi-tenant setup in read-only mode, so no one can make changes to the model.
2- Make a copy of the base model and fine-tune it with their own customizations in a single-tenant setup; the copy is kept in a fine-tuned model S3 bucket.
3- Train the model using their own training data. All customization and training are only accessible to the customer and are never fed back into the common base models.
  • In this model, Amazon SageMaker is used for training orchestration. An Amazon SageMaker training job is started in the model provider's account.
  • SageMaker then reaches out to the customer's S3 bucket for the training data, and the fine-tuned model is placed in an S3 bucket in the model provider's account.
  • The customer's data is never placed in the model provider's account.
  • Once trained, this fine-tuned model can also be served from a single-tenant endpoint (a sketch of the customer-facing customization call follows the diagram below).
image.png
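
From the customer side, this backend flow is triggered through Bedrock's model customization API. Below is a hedged sketch of such a call; the job and model names, role ARN, bucket URIs, base model ID and hyperparameter values are placeholders, not prescriptions:

```python
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

response = bedrock.create_model_customization_job(
    jobName="my-finetune-job",                 # hypothetical job name
    customModelName="my-custom-model",         # hypothetical model name
    roleArn="arn:aws:iam::111122223333:role/BedrockCustomizationRole",  # placeholder
    baseModelIdentifier="amazon.titan-text-express-v1",  # example base model
    customizationType="FINE_TUNING",
    # Training data stays in the customer-owned S3 bucket...
    trainingDataConfig={"s3Uri": "s3://my-training-bucket/train.jsonl"},
    # ...and output artifacts/metrics land in the configured output location.
    outputDataConfig={"s3Uri": "s3://my-output-bucket/finetuned/"},
    hyperParameters={"epochCount": "2", "batchSize": "1", "learningRate": "0.00001"},
)
print(response["jobArn"])
```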

 

Amazon Bedrock Frontend Architecture

The Amazon Bedrock service is made accessible via an API endpoint, just like any other Amazon service. This API endpoint can be reached in different ways (a minimal invocation sketch follows the diagram below):
a. From a VPC using a NAT Gateway (or, if Client-A has a public IP, directly over the IGW)
b. Using AWS PrivateLink
c. From on-prem, using Direct Connect (via TGW) or directly over the internet
 
image.png
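
Whichever network path is used, the application ultimately calls the same bedrock-runtime API endpoint. Here is a minimal sketch of such a call; the model ID, prompt and the commented-out PrivateLink endpoint URL are illustrative assumptions:

```python
import json
import boto3

# Default path: the regional bedrock-runtime endpoint (reached via NAT/IGW,
# Direct Connect, etc.). For PrivateLink without private DNS, endpoint_url can
# be pointed at the interface endpoint's DNS name instead.
runtime = boto3.client(
    "bedrock-runtime",
    region_name="us-east-1",
    # endpoint_url="https://vpce-0abc123-xyz.bedrock-runtime.us-east-1.vpce.amazonaws.com",  # hypothetical
)

response = runtime.invoke_model(
    modelId="amazon.titan-text-express-v1",   # example model
    body=json.dumps({"inputText": "Summarize Amazon Bedrock in one sentence."}),
    contentType="application/json",
    accept="application/json",
)
print(json.loads(response["body"].read()))
```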

 

Overall Bedrock Architecture

The following diagram pieces the frontend and backend architecture together
Amazon Bedrock-Overall-Architecture.png
 
Let's look at the right side of the above diagram, which shows how SageMaker gets access to your S3 bucket, especially when the bucket is not exposed to the internet, is only accessible from a VPC, and has bucket-specific policies. In this case, SageMaker must be configured to drop an ENI into a subnet in customer VPC-C that is provisioned with an S3 PrivateLink endpoint with access to the bucket.
image.jpeg
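
One way to express the bucket-side control described above is an S3 bucket policy that denies any request not arriving through the S3 endpoint in VPC-C. The sketch below is illustrative only; the bucket name and endpoint ID are placeholders:

```python
import json
import boto3

s3 = boto3.client("s3")

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowOnlyViaVpcEndpoint",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::my-training-bucket",      # placeholder bucket
                "arn:aws:s3:::my-training-bucket/*",
            ],
            # Deny any request that does not arrive through the S3 endpoint
            # provisioned in customer VPC-C (placeholder endpoint ID).
            "Condition": {"StringNotEquals": {"aws:sourceVpce": "vpce-0123456789abcdef0"}},
        }
    ],
}

s3.put_bucket_policy(Bucket="my-training-bucket", Policy=json.dumps(policy))
```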

Connectivity, Network Security and Observability Considerations

 

a. NAT Gateway:

  • A NAT Gateway provides unrestricted outbound access to the internet.
  • A NAT Gateway has its own per-hour and per-GB cost, which can add up very quickly (see the rough cost sketch below).
  • When multiple VPCs need access to Amazon Bedrock, NAT Gateways can become very costly.
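
As a rough illustration of how quickly this adds up, the back-of-the-envelope calculation below assumes us-east-1-style pricing of about $0.045 per NAT Gateway hour and $0.045 per GB processed (check current pricing), plus hypothetical traffic and VPC counts:

```python
# All figures below are assumptions for illustration only.
HOURLY_RATE = 0.045      # USD per NAT Gateway hour (assumed)
PER_GB_RATE = 0.045      # USD per GB processed (assumed)

hours_per_month = 730
gb_processed = 2_000     # hypothetical monthly Bedrock traffic per VPC
vpcs = 10                # hypothetical number of VPCs, each with its own NAT Gateway

monthly_cost = vpcs * (hours_per_month * HOURLY_RATE + gb_processed * PER_GB_RATE)
print(f"~${monthly_cost:,.0f}/month")   # on the order of $1.2K/month in this example
```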

 

b. AWS PrivateLink

Although AWS PrivateLink provides a secure path to the Amazon Bedrock API endpoint, it poses the following two challenges (a provisioning sketch follows this list):
i. Many customers deploy multiple applications in the same VPC but in different subnets. If only select applications or subnets should have access to Amazon Bedrock, it is not easy to limit that access. Security groups can be used, but they are static and need to be updated as applications scale.
ii. When multiple VPCs need access to Amazon Bedrock, AWS PrivateLink endpoints can become very costly.
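
For reference, provisioning the interface endpoint itself is straightforward; the sketch below assumes placeholder VPC, subnet and security group IDs and uses the bedrock-runtime service name:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

endpoint = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0aaa1111bbb22222c",                        # placeholder
    ServiceName="com.amazonaws.us-east-1.bedrock-runtime",
    SubnetIds=["subnet-0123456789abcdef0"],               # placeholder
    # Security groups are the main (static) way to scope which applications
    # or subnets can reach the endpoint -- the limitation noted in (i) above.
    SecurityGroupIds=["sg-0fedcba9876543210"],            # placeholder
    PrivateDnsEnabled=True,
)
print(endpoint["VpcEndpoint"]["VpcEndpointId"])
```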

 

c. Prod vs Non-Prod Segmentation

When there is a centralized VPC with access to Amazon Bedrock (via PrivateLink), it is not easy to limit access to only specific VPCs. As many organizations choose to keep separate trained models for production versus non-production (such as test and dev), it becomes critical that network access is segmented as well.
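
One possible, partial lever is an endpoint policy on the centralized Bedrock PrivateLink. The sketch below only allows principals carrying a hypothetical environment=prod tag; the endpoint ID, action scope and tag key/value are assumptions:

```python
import json
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Endpoint policy allowing model invocation only for principals tagged as prod.
endpoint_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": "bedrock:InvokeModel",
            "Resource": "*",
            "Condition": {"StringEquals": {"aws:PrincipalTag/environment": "prod"}},
        }
    ],
}

ec2.modify_vpc_endpoint(
    VpcEndpointId="vpce-0123456789abcdef0",   # the central Bedrock endpoint (placeholder)
    PolicyDocument=json.dumps(endpoint_policy),
)
```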

 

d. Network Observability

As traffic to Bedrock may come from different VPCs, Regions, and on-prem locations, across different accounts, there is a need for a common multi-account dashboard that can consolidate all access paths and provide visual paths in addition to complete NetFlow data. This also makes it possible to investigate access history for potential exposure.
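
Such a dashboard still needs the underlying telemetry enabled in each VPC. As a hedged sketch, the call below turns on VPC Flow Logs to CloudWatch Logs for one VPC; the VPC ID, log group name and IAM role ARN are placeholders:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Capture all accepted and rejected traffic for the VPC and deliver it to
# CloudWatch Logs, where a multi-account dashboard can consolidate it.
ec2.create_flow_logs(
    ResourceType="VPC",
    ResourceIds=["vpc-0aaa1111bbb22222c"],       # placeholder
    TrafficType="ALL",
    LogDestinationType="cloud-watch-logs",
    LogGroupName="/vpc/bedrock-access",          # hypothetical log group
    DeliverLogsPermissionArn="arn:aws:iam::111122223333:role/FlowLogsRole",  # placeholder
)
```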

 

e. Multi-region, Hybrid and Multi-cloud access

