
AWS Certified Solutions Architect – Associate SAA-C03 Exam Questions and Answers – Page 2 Part 1

The latest AWS Certified Solutions Architect – Associate SAA-C03 practice exam questions and answers (Q&A) are available free of charge. They can help you prepare for the AWS Certified Solutions Architect – Associate SAA-C03 exam and earn the AWS Certified Solutions Architect – Associate certification.

Exam Question 131

A company has a 10 Gbps AWS Direct Connect connection from its on-premises servers to AWS. The workloads using the connection are critical. The company requires a disaster recovery strategy with maximum resiliency that maintains the current connection bandwidth at a minimum.

What should a solutions architect recommend?

A. Set up a new Direct Connect connection in another AWS Region.
B. Set up a new AWS managed VPN connection in another AWS Region.
C. Set up two new Direct Connect connections: one in the current AWS Region and one in another Region.
D. Set up two new AWS managed VPN connections: one in the current AWS Region and one in another Region.

Correct Answer

C. Set up two new Direct Connect connections: one in the current AWS Region and one in another Region.
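
For illustration only, here is a minimal boto3 sketch of ordering the two additional Direct Connect connections, one in the current Region and one in a second Region, at the same 10 Gbps bandwidth. The Regions, Direct Connect location codes, and connection names are assumptions, not values from the question.

import boto3

# Hypothetical Regions and Direct Connect location codes -- replace with real values.
CONNECTIONS = [
    {"region": "us-east-1", "location": "EqDC2", "name": "dr-current-region-dx"},
    {"region": "us-west-2", "location": "EqSE2", "name": "dr-second-region-dx"},
]

for conn in CONNECTIONS:
    dx = boto3.client("directconnect", region_name=conn["region"])
    # Order a dedicated 10 Gbps connection so DR capacity matches the existing link.
    response = dx.create_connection(
        location=conn["location"],
        bandwidth="10Gbps",
        connectionName=conn["name"],
    )
    print(response["connectionId"], response["connectionState"])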

Exam Question 132

As part of budget planning, management wants a report of AWS billed items listed by user. The data will be used to create department budgets. A solutions architect needs to determine the most efficient way to obtain this report information.

Which solution meets these requirements?

A. Run a query with Amazon Athena to generate the report.
B. Create a report in Cost Explorer and download the report.
C. Access the bill details from the billing dashboard and download the bill.
D. Modify a cost budget in AWS Budgets to alert with Amazon Simple Email Service (Amazon SES).

Correct Answer

B. Create a report in Cost Explorer and download the report.
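
The recommended approach is the Cost Explorer console report, but the same data can also be pulled programmatically. The sketch below is a hedged boto3 example that groups monthly unblended cost by a user-level cost allocation tag; the tag key "Owner" and the date range are assumptions.

import boto3

ce = boto3.client("ce")  # Cost Explorer

# Monthly cost grouped by a user-level cost allocation tag (assumed key: "Owner").
response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2023-01-01", "End": "2023-04-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "TAG", "Key": "Owner"}],
)

for period in response["ResultsByTime"]:
    for group in period["Groups"]:
        tag_value = group["Keys"][0]                          # e.g. "Owner$alice"
        amount = group["Metrics"]["UnblendedCost"]["Amount"]
        print(period["TimePeriod"]["Start"], tag_value, amount)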

Exam Question 133

A company with facilities in North America, Europe, and Asia is designing a new distributed application to optimize its global supply chain and manufacturing process. Orders booked on one continent should be visible in all Regions within one second or less. The database should be able to support failover with a short Recovery Time Objective (RTO). The uptime of the application is important to ensure that manufacturing is not impacted.

What should a solutions architect recommend?

A. Use Amazon DynamoDB global tables.
B. Use Amazon Aurora Global Database.
C. Use Amazon RDS for MySQL with a cross-Region read replica.
D. Use Amazon RDS for PostgreSQL with a cross-Region read replica.

Correct Answer

B. Use Amazon Aurora Global Database.

Answer Description

Cross-Region Disaster Recovery: If your primary region suffers a performance degradation or outage, you can promote one of the secondary regions to take read/write responsibilities. An Aurora cluster can recover in less than 1 minute even in the event of a complete regional outage. This provides your application with an effective Recovery Point Objective (RPO) of 1 second and a Recovery Time Objective (RTO) of less than 1 minute, providing a strong foundation for a global business continuity plan.
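
As a rough illustration, the boto3 calls below create an Aurora Global Database with a primary cluster in one Region and a read-only secondary cluster in another. The Regions, identifiers, and credentials are placeholder assumptions; in practice each cluster would also need DB instances created with create_db_instance.

import boto3

PRIMARY_REGION = "us-east-1"      # assumed
SECONDARY_REGION = "eu-west-1"    # assumed

rds_primary = boto3.client("rds", region_name=PRIMARY_REGION)
rds_secondary = boto3.client("rds", region_name=SECONDARY_REGION)

# 1. Create the global cluster container.
rds_primary.create_global_cluster(
    GlobalClusterIdentifier="supply-chain-global",
    Engine="aurora-mysql",
)

# 2. Create the primary (read/write) cluster and attach it to the global cluster.
rds_primary.create_db_cluster(
    DBClusterIdentifier="supply-chain-primary",
    Engine="aurora-mysql",
    GlobalClusterIdentifier="supply-chain-global",
    MasterUsername="admin",
    MasterUserPassword="REPLACE_ME",  # use AWS Secrets Manager in practice
)

# 3. Create the read-only secondary cluster in the second Region.
rds_secondary.create_db_cluster(
    DBClusterIdentifier="supply-chain-secondary",
    Engine="aurora-mysql",
    GlobalClusterIdentifier="supply-chain-global",
)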

Exam Question 134

A company’s near-real-time streaming application is running on AWS. As the data is ingested, a job runs on the data and takes 30 minutes to complete. The workload frequently experiences high latency due to large amounts of incoming data. A solutions architect needs to design a scalable and serverless solution to enhance performance.

Which combination of steps should the solutions architect take? (Choose two.)

A. Use Amazon Kinesis Data Firehose to ingest the data.
B. Use AWS Lambda with AWS Step Functions to process the data.
C. Use AWS Database Migration Service (AWS DMS) to ingest the data.
D. Use Amazon EC2 instances in an Auto Scaling group to process the data.
E. Use AWS Fargate with Amazon Elastic Container Service (Amazon ECS) to process the data.

Correct Answer

A. Use Amazon Kinesis Data Firehose to ingest the data.
E. Use AWS Fargate with Amazon Elastic Container Service (Amazon ECS) to process the data.
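
A hedged sketch of the two pieces: putting incoming records onto a Kinesis Data Firehose delivery stream, then launching the 30-minute processing job as a Fargate task on ECS. The stream name, cluster, task definition, and network settings are assumptions.

import json
import boto3

firehose = boto3.client("firehose")
ecs = boto3.client("ecs")

# Ingest a record into an existing Firehose delivery stream (assumed name).
firehose.put_record(
    DeliveryStreamName="streaming-ingest",
    Record={"Data": json.dumps({"order_id": 123, "qty": 4}).encode("utf-8")},
)

# Run the long-running processing job as a serverless Fargate task (assumed names).
ecs.run_task(
    cluster="processing-cluster",
    launchType="FARGATE",
    taskDefinition="stream-processor:1",
    networkConfiguration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0123456789abcdef0"],
            "assignPublicIp": "DISABLED",
        }
    },
)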

Exam Question 135

An application running on an Amazon EC2 instance needs to access an Amazon DynamoDB table. Both the EC2 instance and the DynamoDB table are in the same AWS account. A solutions architect must configure the necessary permissions.

Which solution will allow least privilege access to the DynamoDB table from the EC2 instance?

A. Create an IAM role with the appropriate policy to allow access to the DynamoDB table. Create an instance profile to assign this IAM role to the EC2 instance.
B. Create an IAM role with the appropriate policy to allow access to the DynamoDB table. Add the EC2 instance to the trust relationship policy document to allow it to assume the role.
C. Create an IAM user with the appropriate policy to allow access to the DynamoDB table. Store the credentials in an Amazon S3 bucket and read them from within the application code directly.
D. Create an IAM user with the appropriate policy to allow access to the DynamoDB table. Ensure that the application stores the IAM credentials securely on local storage and uses them to make the DynamoDB calls.

Correct Answer

A. Create an IAM role with the appropriate policy to allow access to the DynamoDB table. Create an instance profile to assign this IAM role to the EC2 instance.
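
For illustration, a minimal boto3 sketch of the role-based approach: a role that allows only the needed DynamoDB actions on a single table, wrapped in an instance profile and attached to the instance. The table ARN, action list, resource names, and instance ID are assumptions.

import json
import boto3

iam = boto3.client("iam")
ec2 = boto3.client("ec2")

TABLE_ARN = "arn:aws:dynamodb:us-east-1:111122223333:table/AppTable"  # assumed

# Trust policy so EC2 can assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "ec2.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

# Least-privilege permissions: only the actions the application needs, on one table.
table_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:Query"],
        "Resource": TABLE_ARN,
    }],
}

iam.create_role(RoleName="app-dynamodb-role",
                AssumeRolePolicyDocument=json.dumps(trust_policy))
iam.put_role_policy(RoleName="app-dynamodb-role",
                    PolicyName="AppTableAccess",
                    PolicyDocument=json.dumps(table_policy))

# The instance profile is the container that attaches the role to the EC2 instance.
iam.create_instance_profile(InstanceProfileName="app-dynamodb-profile")
iam.add_role_to_instance_profile(InstanceProfileName="app-dynamodb-profile",
                                 RoleName="app-dynamodb-role")
ec2.associate_iam_instance_profile(
    IamInstanceProfile={"Name": "app-dynamodb-profile"},
    InstanceId="i-0123456789abcdef0",  # assumed instance ID
)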

Exam Question 136

A solutions architect is designing a solution that involves orchestrating a series of Amazon Elastic Container Service (Amazon ECS) task types running on Amazon EC2 instances that are part of an ECS cluster. The output and state data for all tasks needs to be stored. The amount of data output by each task is approximately 10 MB, and there could be hundreds of tasks running at a time. The system should be optimized for high-frequency reading and writing. As old outputs are archived and deleted, the storage size is not expected to exceed 1 TB.

Which storage solution should the solutions architect recommend?

A. An Amazon DynamoDB table accessible by all ECS cluster instances.
B. An Amazon Elastic File System (Amazon EFS) with Provisioned Throughput mode.
C. An Amazon Elastic File System (Amazon EFS) file system with Bursting Throughput mode.
D. An Amazon Elastic Block Store (Amazon EBS) volume mounted to the ECS cluster instances.

Correct Answer

C. An Amazon Elastic File System (Amazon EFS) file system with Bursting Throughput mode.
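
A hedged boto3 sketch of creating the shared file system with Bursting Throughput mode and a mount target so the container instances in the cluster can mount it. The subnet and security group IDs are placeholders.

import boto3

efs = boto3.client("efs")

# Shared file system; bursting throughput scales with the amount of data stored.
fs = efs.create_file_system(
    CreationToken="ecs-task-output",
    PerformanceMode="generalPurpose",
    ThroughputMode="bursting",
    Encrypted=True,
)

# One mount target per Availability Zone lets all ECS container instances mount it.
efs.create_mount_target(
    FileSystemId=fs["FileSystemId"],
    SubnetId="subnet-0123456789abcdef0",        # assumed
    SecurityGroups=["sg-0123456789abcdef0"],    # assumed
)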

Exam Question 137

A company wants to migrate its MySQL database from on premises to AWS. The company recently experienced a database outage that significantly impacted the business. To ensure this does not happen again, the company wants a reliable database solution on AWS that minimizes data loss and stores every transaction on at least two nodes.

Which solution meets these requirements?

A. Create an Amazon RDS DB instance with synchronous replication to three nodes in three Availability Zones.
B. Create an Amazon RDS MySQL DB instance with Multi-AZ functionality enabled to synchronously replicate the data.
C. Create an Amazon RDS MySQL DB instance and then create a read replica in a separate AWS Region that synchronously replicates the data.
D. Create an Amazon EC2 instance with a MySQL engine installed that triggers an AWS Lambda function to synchronously replicate the data to an Amazon RDS MySQL DB instance.

Correct Answer

A. Create an Amazon RDS DB instance with synchronous replication to three nodes in three Availability Zones.

Exam Question 138

A company stores user data in AWS. The data is used continuously with peak usage during business hours. Access patterns vary, with some data not being used for months at a time. A solutions architect must choose a cost-effective solution that maintains the highest level of durability while maintaining high availability.

Which storage solution meets these requirements?

A. Amazon S3 Standard
B. Amazon S3 Intelligent-Tiering
C. Amazon S3 Glacier Deep Archive
D. Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA)

Correct Answer

B. Amazon S3 Intelligent-Tiering
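
As a sketch, objects can be written directly to the S3 Intelligent-Tiering storage class, or moved there by a lifecycle rule. The bucket and key names below are assumptions.

import boto3

s3 = boto3.client("s3")

# Write new objects straight into Intelligent-Tiering (assumed bucket/key).
s3.put_object(
    Bucket="user-data-bucket",
    Key="profiles/user-123.json",
    Body=b'{"name": "example"}',
    StorageClass="INTELLIGENT_TIERING",
)

# Or transition existing objects with a lifecycle rule.
s3.put_bucket_lifecycle_configuration(
    Bucket="user-data-bucket",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "move-to-intelligent-tiering",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},
            "Transitions": [{"Days": 0, "StorageClass": "INTELLIGENT_TIERING"}],
        }]
    },
)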

Exam Question 139

A company receives inconsistent service from its data center provider because the company is headquartered in an area affected by natural disasters. The company is not ready to fully migrate to the AWS Cloud, but it wants a failover environment on AWS in case the on-premises data center fails.

The company runs web servers that connect to external vendors. The data available on AWS and on-premises must be uniform.

Which solution should a solutions architect recommend that has the LEAST amount of downtime?

A. Configure an Amazon Route 53 failover record. Run application servers on Amazon EC2 instances behind an Application Load Balancer in an Auto Scaling group. Set up AWS Storage Gateway with stored volumes to back up data to Amazon S3.
B. Configure an Amazon Route 53 failover record. Execute an AWS CloudFormation template from a script to create Amazon EC2 instances behind an Application Load Balancer. Set up AWS Storage Gateway with stored volumes to back up data to Amazon S3.
C. Configure an Amazon Route 53 failover record. Set up an AWS Direct Connect connection between a VPC and the data center. Run application servers on Amazon EC2 in an Auto Scaling group. Run an AWS Lambda function to execute an AWS CloudFormation template to create an Application Load Balancer.
D. Configure an Amazon Route 53 failover record. Run an AWS Lambda function to execute an AWS CloudFormation template to launch two Amazon EC2 instances. Set up AWS Storage Gateway with stored volumes to back up data to Amazon S3. Set up an AWS Direct Connect connection between a VPC and the data center.

Correct Answer

A. Configure an Amazon Route 53 failover record. Run application servers on Amazon EC2 instances behind an Application Load Balancer in an Auto Scaling group. Set up AWS Storage Gateway with stored volumes to back up data to Amazon S3.
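
A hedged sketch of the DNS piece: a PRIMARY failover record pointing at the on-premises web servers (protected by a health check) and a SECONDARY record aliasing the Application Load Balancer on AWS. The hosted zone ID, domain, IP address, health check ID, and ALB values are all assumptions.

import boto3

route53 = boto3.client("route53")

route53.change_resource_record_sets(
    HostedZoneId="Z0EXAMPLE",  # assumed
    ChangeBatch={
        "Changes": [
            {   # Primary: the on-premises web servers, monitored by a health check.
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "www.example.com",
                    "Type": "A",
                    "SetIdentifier": "on-premises-primary",
                    "Failover": "PRIMARY",
                    "TTL": 60,
                    "ResourceRecords": [{"Value": "203.0.113.10"}],
                    "HealthCheckId": "11111111-2222-3333-4444-555555555555",
                },
            },
            {   # Secondary: the Application Load Balancer in AWS.
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "www.example.com",
                    "Type": "A",
                    "SetIdentifier": "aws-secondary",
                    "Failover": "SECONDARY",
                    "AliasTarget": {
                        "HostedZoneId": "Z35SXDOTRQ7X7K",  # ALB hosted zone ID (assumed)
                        "DNSName": "my-alb-1234567890.us-east-1.elb.amazonaws.com",
                        "EvaluateTargetHealth": True,
                    },
                },
            },
        ]
    },
)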

Exam Question 140

What should a solutions architect do to ensure that all objects uploaded to an Amazon S3 bucket are encrypted?

A. Update the bucket policy to deny if the PutObject does not have an s3:x-amz-acl header set.
B. Update the bucket policy to deny if the PutObject does not have an s3:x-amz-acl header set to private.
C. Update the bucket policy to deny if the PutObject does not have an aws:SecureTransport header set to true.
D. Update the bucket policy to deny if the PutObject does not have an x-amz-server-side-encryption header set.

Correct Answer

D. Update the bucket policy to deny if the PutObject does not have an x-amz-server-side-encryption header set.
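
A hedged sketch of such a bucket policy applied with boto3: the Deny takes effect whenever a PutObject request arrives without the x-amz-server-side-encryption header, checked through the s3:x-amz-server-side-encryption condition key. The bucket name is an assumption.

import json
import boto3

s3 = boto3.client("s3")

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyUnencryptedUploads",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:PutObject",
        "Resource": "arn:aws:s3:::example-bucket/*",
        # Deny when the request does not carry the encryption header at all.
        "Condition": {"Null": {"s3:x-amz-server-side-encryption": "true"}},
    }],
}

s3.put_bucket_policy(Bucket="example-bucket", Policy=json.dumps(policy))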