The latest AWS Certified Solutions Architect – Associate (SAA-C03) practice exam questions and answers (Q&A) are available for free to help you pass the AWS Certified Solutions Architect – Associate (SAA-C03) exam and earn the certification.
Table of Contents
- Question 1231
- Exam Question
- Correct Answer
- Explanation
- Question 1232
- Exam Question
- Correct Answer
- Explanation
- Question 1233
- Exam Question
- Correct Answer
- Explanation
- Question 1234
- Exam Question
- Correct Answer
- Explanation
- Question 1235
- Exam Question
- Correct Answer
- Explanation
- Question 1236
- Exam Question
- Correct Answer
- Explanation
- Question 1237
- Exam Question
- Correct Answer
- Explanation
- Question 1238
- Exam Question
- Correct Answer
- Explanation
- Question 1239
- Exam Question
- Correct Answer
- Explanation
- Question 1240
- Exam Question
- Correct Answer
- Explanation
Question 1231
Exam Question
A solutions architect is creating a data processing job that runs once daily and can take up to 2 hours to complete. If the job is interrupted, it has to restart from the beginning.
How should the solutions architect address this issue in the MOST cost-effective manner?
A. Create a script that runs locally on an Amazon EC2 Reserved Instance that is triggered by a cron job.
B. Create an AWS Lambda function triggered by an Amazon EventBridge (Amazon CloudWatch Events) scheduled event.
C. Use an Amazon Elastic Container Service (Amazon ECS) Fargate task triggered by an Amazon EventBridge (Amazon CloudWatch Events) scheduled event.
D. Use an Amazon Elastic Container Service (Amazon ECS) task running on Amazon EC2 triggered by an Amazon EventBridge (Amazon CloudWatch Events) scheduled event.
Correct Answer
C. Use an Amazon Elastic Container Service (Amazon ECS) Fargate task triggered by an Amazon EventBridge (Amazon CloudWatch Events) scheduled event.
Explanation
To run a data processing job once daily that can take up to 2 hours to complete and must restart from the beginning if interrupted, the most cost-effective solution is:
C. Use an Amazon Elastic Container Service (Amazon ECS) Fargate task triggered by an Amazon EventBridge (Amazon CloudWatch Events) scheduled event.
AWS Fargate is a serverless compute engine for containers. There are no servers to provision, patch, or scale, and you pay only for the vCPU and memory consumed while the task is running, which for this workload is at most 2 hours per day. An Amazon EventBridge scheduled rule can launch the Fargate task once daily, and the task can run until the job finishes.
Option B is not viable because an AWS Lambda function has a maximum execution time of 15 minutes, far less than the 2 hours this job may require.
Option A requires paying for an Amazon EC2 Reserved Instance around the clock to run a job that needs at most 2 hours per day, and it adds operating system and cron management overhead.
Option D requires provisioning and managing Amazon EC2 container instances that would sit idle for most of the day.
Therefore, option C is the most cost-effective solution for this scenario.
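For illustration, a minimal sketch of the scheduling side, assuming a Fargate task definition, an ECS cluster, and an IAM role for EventBridge already exist; the names, ARNs, subnet ID, and cron time below are hypothetical placeholders.
```
# Hypothetical names and ARNs: run the job once a day at 02:00 UTC as a Fargate task
aws events put-rule \
  --name daily-processing-job \
  --schedule-expression "cron(0 2 * * ? *)"

aws events put-targets \
  --rule daily-processing-job \
  --targets '[{
    "Id": "daily-task",
    "Arn": "arn:aws:ecs:us-east-1:111122223333:cluster/processing-cluster",
    "RoleArn": "arn:aws:iam::111122223333:role/ecsEventsRole",
    "EcsParameters": {
      "TaskDefinitionArn": "arn:aws:ecs:us-east-1:111122223333:task-definition/daily-job:1",
      "LaunchType": "FARGATE",
      "NetworkConfiguration": {
        "awsvpcConfiguration": { "Subnets": ["subnet-0123456789abcdef0"] }
      }
    }
  }]'
```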
Question 1232
Exam Question
A user wants to list the IAM role that is attached to their Amazon EC2 instance. The user has login access to the EC2 instance but does not have IAM permissions.
What should a solutions architect do to retrieve this information?
A. Run the following EC2 command: curl http://169.254.169.254/latest/meta-data/iam/info
B. Run the following EC2 command: curl http://169.254.169.254/latest/user-data/iam/info
C. Run the following EC2 command: http://169.254.169.254/latest/dynamic/instance-identity/
D. Run the following AWS CLI command: aws iam get-instance-profile --instance-profile-name ExampleInstanceProfile
Correct Answer
A. Run the following EC2 command: curl http://169.254.169.254/latest/meta-data/iam/info
Explanation
To retrieve the IAM role that is attached to an Amazon EC2 instance when the user does not have IAM permissions, the following solution can be used:
A. Run the following EC2 command: curl http://169.254.169.254/latest/meta-data/iam/info
The EC2 instance metadata service (IMDS) is a local HTTP service that lets software running on an instance retrieve metadata about that instance, including details of its attached IAM role, without requiring any IAM permissions. The metadata is available at the link-local address 169.254.169.254.
By running the specified command, curl http://169.254.169.254/latest/meta-data/iam/info, on the EC2 instance, the user can retrieve information about the IAM role attached to that instance.
Option B, curl http://169.254.169.254/latest/user-data/iam/info, is incorrect because the user data endpoint does not provide information about the IAM role.
Option C, http://169.254.169.254/latest/dynamic/instance-identity/, is incorrect because this endpoint provides information about the EC2 instance identity, not the IAM role.
Option D, aws iam get-instance-profile --instance-profile-name ExampleInstanceProfile, is incorrect because it requires IAM permissions to use the AWS CLI command aws iam get-instance-profile.
Therefore, option A is the correct solution to retrieve the IAM role information in this scenario.
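If the instance enforces IMDSv2 (the token-based version of the instance metadata service), the curl call in option A must first obtain a session token. A minimal sketch, assuming a Linux shell on the instance:
```
# Request an IMDSv2 session token, then query the IAM metadata path with it
TOKEN=$(curl -s -X PUT "http://169.254.169.254/latest/api/token" \
  -H "X-aws-ec2-metadata-token-ttl-seconds: 21600")
curl -s -H "X-aws-ec2-metadata-token: $TOKEN" \
  http://169.254.169.254/latest/meta-data/iam/info
```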
Question 1233
Exam Question
A company is creating a three-tier web application consisting of a web server, an application server, and a database server. The application will track GPS coordinates of packages as they are being delivered. The application will update the database every 0–5 seconds. The tracking data will need to be read as fast as possible for users to check the status of their packages. Only a few packages might be tracked on some days, whereas millions of packages might be tracked on other days. Tracking will need to be searchable by tracking ID, customer ID, and order ID. Orders older than 1 month no longer need to be tracked.
What should a solution architect recommend to accomplish this with minimal cost of ownership?
A. Use Amazon DynamoDB Enable Auto Scaling on the DynamoDB table. Schedule an automatic deletion script for items older than 1 month.
B. Use Amazon DynamoDB with global secondary indexes. Enable Auto Scaling on the DynamoDB table and the global secondary indexes. Enable TTL on the DynamoDB table.
C. Use an Amazon RDS On-Demand instance with Provisioned IOPS (PIOPS). Enable Amazon CloudWatch alarms to send notifications when PIOPS are exceeded. Increase and decrease PIOPS as needed.
D. Use an Amazon RDS Reserved Instance with Provisioned IOPS (PIOPS). Enable Amazon CloudWatch alarms to send notification when PIOPS are exceeded. Increase and decrease PIOPS as needed.
Correct Answer
B. Use Amazon DynamoDB with global secondary indexes. Enable Auto Scaling on the DynamoDB table and the global secondary indexes. Enable TTL on the DynamoDB table.
Explanation
In this scenario, the application must handle highly variable write traffic (a few packages on some days, millions on others), return tracking data in milliseconds, support lookups by tracking ID, customer ID, and order ID, and discard data older than 1 month, all with the lowest cost of ownership. Amazon DynamoDB with global secondary indexes, Auto Scaling, and TTL (option B) meets all of these requirements:
- Global secondary indexes allow the tracking data to be queried efficiently by customer ID and order ID in addition to the tracking ID, instead of relying on a single primary key.
- Auto Scaling on the table and its global secondary indexes adjusts read and write capacity automatically as tracking volume swings between a handful of packages and millions of packages, so the company pays only for the capacity it needs.
- TTL (Time to Live) automatically deletes items once their expiration timestamp passes, at no additional cost, so records older than 1 month are removed without any scheduled jobs or extra write capacity.
Option A omits the global secondary indexes needed to search by customer ID and order ID, and its scheduled deletion script consumes write capacity and requires additional infrastructure to run, making it more expensive to operate than TTL.
Options C and D, which use Amazon RDS with Provisioned IOPS and CloudWatch alarms, require manual capacity adjustments and continuous instance charges, and a relational database is not the best fit for this high-volume, key-value tracking workload.
Therefore, option B is the recommended approach: Amazon DynamoDB with global secondary indexes, Auto Scaling, and TTL.
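As an illustration of the TTL part of option B, a minimal sketch, assuming a table named PackageTracking with a numeric expiresAt attribute that holds each item's expiration time as an epoch timestamp (both names are hypothetical):
```
# Hypothetical table and attribute names: items whose expiresAt timestamp has
# passed are deleted automatically, at no extra cost
aws dynamodb update-time-to-live \
  --table-name PackageTracking \
  --time-to-live-specification "Enabled=true, AttributeName=expiresAt"
```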
Question 1234
Exam Question
A software vendor is deploying a new software-as-a-service (SaaS) solution that will be utilized by many AWS users. The service is hosted in a VPC behind a Network Load Balancer. The software vendor wants to provide access to this service to users with the least amount of administrative overhead and without exposing the service to the public internet.
What should a solutions architect do to accomplish this goal?
A. Create a peering VPC connection from each user’s VPC to the software vendor’s VPC.
B. Deploy a transit VPC in the software vendor’s AWS account. Create a VPN connection with each user account.
C. Connect the service in the VPC with an AWS PrivateLink endpoint. Have users subscribe to the endpoint.
D. Deploy a transit VPC in the software vendor’s AWS account. Create an AWS Direct Connect connection with each user account.
Correct Answer
C. Connect the service in the VPC with an AWS PrivateLink endpoint. Have users subscribe to the endpoint.
Explanation
To provide access to the software-as-a-service (SaaS) solution hosted in a VPC with the least administrative overhead, and without exposing the service to the public internet, AWS PrivateLink is the recommended approach.
AWS PrivateLink enables private connectivity between VPCs, AWS services, and on-premises applications over the Amazon network. It allows the software vendor to expose the service behind the Network Load Balancer privately to authorized consumers without public IP addresses or internet gateways.
In this scenario, the software vendor should publish the service as a VPC endpoint service backed by the Network Load Balancer. Users who want to access the service can then create an interface VPC endpoint in their own VPCs and subscribe to the endpoint service, establishing a secure, private connection to the SaaS solution.
Option A, creating a VPC peering connection from each user's VPC to the software vendor's VPC, would require setting up and managing a separate connection for every customer and coordinating non-overlapping CIDR ranges, which increases administrative overhead.
Options B and D, deploying a transit VPC and creating VPN or AWS Direct Connect connections with each user account, add per-customer cost and configuration that this scenario is explicitly trying to avoid.
Therefore, option C, connecting the service with an AWS PrivateLink endpoint and having users subscribe to it, is the most suitable solution to achieve secure and private access to the SaaS solution.
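For illustration, a minimal sketch of both sides of the PrivateLink setup, using hypothetical ARNs and resource IDs: the vendor publishes the Network Load Balancer as a VPC endpoint service, and a consumer creates an interface endpoint that subscribes to it.
```
# Vendor account (hypothetical NLB ARN): publish the service as a VPC endpoint service
aws ec2 create-vpc-endpoint-service-configuration \
  --network-load-balancer-arns arn:aws:elasticloadbalancing:us-east-1:111122223333:loadbalancer/net/saas-nlb/0123456789abcdef \
  --acceptance-required

# Consumer account (hypothetical IDs): create an interface endpoint that connects to the service
aws ec2 create-vpc-endpoint \
  --vpc-id vpc-0123456789abcdef0 \
  --vpc-endpoint-type Interface \
  --service-name com.amazonaws.vpce.us-east-1.vpce-svc-0123456789abcdef0 \
  --subnet-ids subnet-0123456789abcdef0 \
  --security-group-ids sg-0123456789abcdef0
```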
Question 1235
Exam Question
A company is working with an external vendor that requires write access to the company Amazon Simple Queue Service (Amazon SQS) queue. The vendor has its own AWS account.
What should a solutions architect do to implement least privilege access?
A. Update the permission policy on the SQS queue to give write access to the vendor AWS account.
B. Create an IAM user with write access to the SQS queue and share the credentials for the IAM user.
C. Update AWS Resource Access Manager to provide write access to the SQS queue from the vendor AWS account.
D. Create a cross-account role with access to all SQS queues and use the vendor AWS account in the trust document for the role.
Correct Answer
A. Update the permission policy on the SQS queue to give write access to the vendor AWS account.
Explanation
To implement least privilege access, the solutions architect should attach a resource-based policy (a queue policy) to the SQS queue that grants the vendor's AWS account only the sqs:SendMessage action on that specific queue. A queue policy scopes the permission to a single action on a single resource for a single external account, which is exactly the principle of least privilege, and it requires no credentials to be shared.
Option B, creating an IAM user with write access to the SQS queue and sharing its credentials, is not recommended because it hands long-term credentials to an external party, which is difficult to rotate, monitor, and revoke.
Option C is not applicable because AWS Resource Access Manager does not support sharing SQS queues; cross-account access to a queue is granted through the queue policy instead.
Option D grants access to all SQS queues in the account, which is far broader than the single queue the vendor needs and therefore violates least privilege.
Therefore, option A is the most appropriate solution.
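A minimal sketch of option A, assuming a hypothetical queue named orders-queue in account 111122223333 and a vendor account ID of 444455556666; the add-permission call appends a statement to the queue's access policy that grants only the SendMessage action on this one queue.
```
# Hypothetical queue URL and vendor account ID: grant the vendor account
# permission to send messages to this single queue only
aws sqs add-permission \
  --queue-url https://sqs.us-east-1.amazonaws.com/111122223333/orders-queue \
  --label VendorSendOnly \
  --aws-account-ids 444455556666 \
  --actions SendMessage
```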
Question 1236
Exam Question
A company runs an application on an Amazon EC2 instance backed by Amazon Elastic Block Store (Amazon EBS). The instance needs to be available for 12 hours daily. The company wants to save costs by making the instance unavailable outside the window required for the application. However, the contents of the instance’s memory must be preserved whenever the instance is unavailable.
What should a solutions architect do to meet this requirement?
A. Stop the instance outside the application’s availability window. Start up the instance again when required.
B. Hibernate the instance outside the application’s availability window. Start up the instance again when required.
C. Use Auto Scaling to scale down the instance outside the application’s availability window. Scale up the instance when required.
D. Terminate the instance outside the application’s availability window. Launch the instance by using a preconfigured Amazon Machine Image (AMI) when required.
Correct Answer
B. Hibernate the instance outside the application's availability window. Start up the instance again when required.
Explanation
Hibernation is the only option that preserves the contents of the instance's memory. When an instance is hibernated, the operating system saves the contents of RAM to the encrypted Amazon EBS root volume and the instance stops incurring compute charges; only the EBS storage is billed. When the instance is started again, the root volume is restored, the RAM contents are reloaded, and previously running processes resume where they left off. Hibernation must be enabled when the instance is launched and requires a supported instance type and an encrypted root volume.
Option A, stopping the instance, saves costs but discards the contents of memory.
Option C, using Auto Scaling to scale the instance down, terminates the instance, which also loses the memory contents.
Option D, terminating the instance and relaunching it from a preconfigured AMI, restores the disk image but not the memory state.
Therefore, option B meets the requirement of making the instance unavailable outside the 12-hour window while preserving its memory.
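A minimal sketch, assuming hibernation was enabled when the instance was launched and using a placeholder instance ID:
```
# Hypothetical instance ID: hibernate outside the availability window
# (RAM is saved to the encrypted EBS root volume)
aws ec2 stop-instances --instance-ids i-0123456789abcdef0 --hibernate

# Start the instance again when the 12-hour window begins; processes resume
aws ec2 start-instances --instance-ids i-0123456789abcdef0
```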
Question 1237
Exam Question
A solutions architect must design a database solution for a high-traffic ecommerce web application. The database stores customer profiles and shopping cart information. The database must support a peak load of several million requests each second and deliver responses in milliseconds. The operational overhead of managing and scaling the database must be minimized.
Which database solution should the solutions architect recommend?
A. Amazon Aurora
B. Amazon DynamoDB
C. Amazon RDS
D. Amazon Redshift
Correct Answer
B. Amazon DynamoDB
Explanation
To meet the requirements of a high-traffic ecommerce web application with a peak load of several million requests per second and fast response times, Amazon DynamoDB is the recommended database solution.
Amazon DynamoDB is a fully managed NoSQL database service that provides fast and scalable performance. It is designed to handle high-traffic workloads and can easily scale to support millions of requests per second. DynamoDB offers automatic scaling capabilities, allowing it to handle varying traffic patterns without significant manual intervention. It also provides low latency, delivering responses in milliseconds, which is crucial for an ecommerce application.
Amazon Aurora (option A) is a highly scalable relational database service, but sustaining several million requests per second would require careful sharding and replica management, adding exactly the kind of operational overhead the requirements ask to minimize.
Amazon RDS (option C) is a managed relational database service that scales primarily by resizing instances and adding read replicas, so it cannot match DynamoDB's horizontal scaling for this traffic level without significant management effort.
Amazon Redshift (option D) is a fully managed data warehousing service designed for analytics and reporting. While it offers high scalability for analytical queries, it may not be the best choice for transactional workloads like customer profiles and shopping cart information in an ecommerce web application.
Therefore, based on the given requirements, Amazon DynamoDB is the most suitable database solution.
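For illustration, a minimal sketch of creating a DynamoDB table for the shopping cart data in on-demand capacity mode, which removes capacity planning entirely; the table and key names are hypothetical.
```
# Hypothetical table and key names: on-demand billing scales to millions of
# requests per second with no throughput management
aws dynamodb create-table \
  --table-name ShoppingCart \
  --attribute-definitions AttributeName=customerId,AttributeType=S \
  --key-schema AttributeName=customerId,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST
```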
Question 1238
Exam Question
A solutions architect is creating a new VPC design. There are two public subnets for the load balancer, two private subnets for web servers, and two private subnets for MySQL. The web servers use only HTTPS. The solutions architect has already created a security group for the load balancer allowing port 443 from 0.0.0.0/0. Company policy requires that each resource has the least access required to still be able to perform its tasks.
Which additional configuration strategy should the solutions architect use to meet these requirements?
A. Create a security group for the web servers and allow port 443 from 0.0.0.0/0. Create a security group for the MySQL servers and allow port 3306 from the web servers security group.
B. Create a network ACL for the web servers and allow port 443 from 0.0.0.0/0. Create a network ACL for the MySQL servers and allow port 3306 from the web servers security group.
C. Create a security group for the web servers and allow port 443 from the load balancer. Create a security group for the MySQL servers and allow port 3306 from the web servers security group.
D. Create a network ACL for the web servers and allow port 443 from the load balancer. Create a network ACL for the MySQL servers and allow port 3306 from the web servers security group.
Correct Answer
C. Create a security group for the web servers and allow port 443 from the load balancer. Create a security group for the MySQL servers and allow port 3306 from the web servers security group.
Explanation
To meet the requirements of providing the least access required for each resource while still allowing them to perform their tasks, the solutions architect should use security groups.
For the web servers, a security group should be created that allows incoming traffic on port 443 (HTTPS) only from the load balancer's security group. This ensures that only traffic forwarded by the load balancer can reach the web servers, rather than traffic from anywhere on the internet.
For the MySQL servers, another security group should be created. In this security group, inbound rules should be configured to allow incoming traffic on port 3306 (MySQL) from the security group assigned to the web servers. By doing so, access to the MySQL servers is restricted only to the web servers that require database connectivity.
Network ACLs (options B and D) operate at the subnet level and provide stateless filtering of traffic. They are not the right tool here because a network ACL rule cannot reference a security group as its source; it can only match CIDR ranges, so it cannot express "allow traffic only from the load balancer" or "only from the web servers."
Therefore, option C is the correct choice as it follows the principle of least privilege by using security groups and configuring them to allow the necessary traffic while restricting access from unwanted sources.
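A minimal sketch of the rules in option C, using hypothetical security group IDs for the load balancer, web tier, and database tier:
```
# Hypothetical security group IDs
ALB_SG=sg-0aaa1111bbbb22223    # load balancer security group (already allows 443 from 0.0.0.0/0)
WEB_SG=sg-0ccc3333dddd44445    # web servers security group
DB_SG=sg-0eee5555ffff66667     # MySQL servers security group

# Web tier: accept HTTPS only from the load balancer's security group
aws ec2 authorize-security-group-ingress \
  --group-id "$WEB_SG" --protocol tcp --port 443 --source-group "$ALB_SG"

# Database tier: accept MySQL only from the web servers' security group
aws ec2 authorize-security-group-ingress \
  --group-id "$DB_SG" --protocol tcp --port 3306 --source-group "$WEB_SG"
```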
Question 1239
Exam Question
A company wants to use high performance computing (HPC) infrastructure on AWS for financial risk modeling. The company's HPC workloads run on Linux. Each HPC workflow runs on hundreds of Amazon EC2 Spot Instances, is short-lived, and generates thousands of output files that are ultimately stored in persistent storage for analytics and long-term future use. The company seeks a cloud storage solution that permits copying on-premises data to long-term persistent storage to make the data available for processing by all EC2 instances. The solution must also be a high-performance file system that is integrated with persistent storage to read and write datasets and output files.
Which combination of AWS services meets these requirements?
A. Amazon FSx for Lustre integrated with Amazon S3
B. Amazon FSx for Windows File Server integrated with Amazon S3
C. Amazon S3 Glacier integrated with Amazon Elastic Block Store (Amazon EBS)
D. Amazon S3 bucket with a VPC endpoint integrated with an Amazon Elastic Block Store (Amazon EBS) General Purpose SSD (gp2) volume
Correct Answer
A. Amazon FSx for Lustre integrated with Amazon S3
Explanation
To meet the requirements of high-performance file system integrated with persistent storage and the ability to copy on-premises data to long-term storage, Amazon FSx for Lustre integrated with Amazon S3 is the recommended solution.
Amazon FSx for Lustre provides a high-performance file system that is optimized for HPC workloads. It offers low-latency access to data and high throughput, making it suitable for financial risk modeling workloads.
By integrating with Amazon S3, the output files and datasets can be stored in Amazon S3 for long-term storage and analytics. Amazon S3 provides scalable, durable, and highly available object storage.
With this combination of services, the company can leverage the high performance of Amazon FSx for Lustre for efficient data processing on hundreds of EC2 Spot Instances, while also leveraging the durability and scalability of Amazon S3 for long-term storage and data availability.
Option B is not suitable as it is for Windows-based file storage, while the company requires a solution for Linux workloads.
Option C is not the best fit as Amazon S3 Glacier is a storage class designed for long-term archival, not for high-performance file system integration.
Option D is also not a suitable solution as it does not provide a high-performance file system, but rather a bucket in Amazon S3 with a VPC endpoint for access from within the VPC.
Therefore, option A, Amazon FSx for Lustre integrated with Amazon S3, is the most appropriate combination of services for the company’s requirements.
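For illustration, a minimal sketch of option A, assuming a hypothetical S3 bucket and subnet ID and a scratch deployment sized at 1,200 GiB; the file system lazily imports objects from the bucket and can export results back to it.
```
# Hypothetical bucket name, subnet ID, and sizing
aws fsx create-file-system \
  --file-system-type LUSTRE \
  --storage-capacity 1200 \
  --subnet-ids subnet-0123456789abcdef0 \
  --lustre-configuration DeploymentType=SCRATCH_2,ImportPath=s3://example-hpc-data,ExportPath=s3://example-hpc-data/results
```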
Question 1240
Exam Question
A company hosts historical weather records in Amazon S3. The records are downloaded from the company's website by way of a URL that resolves to a domain name. Users all over the world access this content through subscriptions. A third-party provider hosts the company's root domain name, but the company recently migrated some of its services to Amazon Route 53. The company wants to consolidate contracts, reduce latency for users, and reduce costs related to serving the application to subscribers.
Which solution meets these requirements?
A. Create a web distribution on Amazon CloudFront to serve the S3 content for the application. Create a CNAME record in a Route 53 hosted zone that points to the CloudFront distribution, resolving to the application’s URL domain name.
B. Create a web distribution on Amazon CloudFront to serve the S3 content for the application. Create an ALIAS record in the Amazon Route 53 hosted zone that points to the CloudFront distribution, resolving to the application’s URL domain name.
C. Create an A record in a Route 53 hosted zone for the application. Create a Route 53 traffic policy for the web application, and configure a geolocation rule. Configure health checks to check the health of the endpoint and route DNS queries to other endpoints if an endpoint is unhealthy.
D. Create an A record in a Route 53 hosted zone for the application. Create a Route 53 traffic policy for the web application, and configure a geo proximity rule. Configure health checks to check the health of the endpoint and route DNS queries to other endpoints if an endpoint is unhealthy.
Correct Answer
B. Create a web distribution on Amazon CloudFront to serve the S3 content for the application. Create an ALIAS record in the Amazon Route 53 hosted zone that points to the CloudFront distribution, resolving to the application’s URL domain name.
Explanation
To consolidate contracts, reduce latency, and reduce costs while serving the application to subscribers, the recommended solution is to use Amazon CloudFront in conjunction with Amazon Route 53.
Option B provides the most efficient and cost-effective approach. By creating a web distribution on Amazon CloudFront, the company can leverage its global edge locations to cache and serve the S3 content closer to the users, reducing latency and improving performance.
To route traffic to the CloudFront distribution, an ALIAS record is created in the Amazon Route 53 hosted zone. ALIAS records allow seamless integration between Route 53 and CloudFront, providing a simple and efficient way to point the domain name to the CloudFront distribution.
Option A is not the best fit because a CNAME record cannot be created at the zone apex of the domain, and Route 53 bills standard queries for CNAME records, whereas alias queries that point to AWS resources such as CloudFront distributions are free.
Options C and D involve configuring traffic policies in Route 53 for geolocation or geo proximity routing, which are not directly related to consolidating contracts, reducing latency, or reducing costs. These options may add complexity without addressing the stated requirements.
Therefore, option B is the most appropriate solution for consolidating contracts, reducing latency, and reducing costs while serving the application to subscribers.
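A minimal sketch of the alias record in option B, assuming a hypothetical hosted zone ID, record name, and CloudFront distribution domain; Z2FDTNDATAQYW2 is the fixed hosted zone ID that Route 53 uses for alias targets that point to CloudFront distributions.
```
# Hypothetical hosted zone ID, record name, and distribution domain
aws route53 change-resource-record-sets \
  --hosted-zone-id Z0123456789EXAMPLE \
  --change-batch '{
    "Changes": [{
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "weather.example.com",
        "Type": "A",
        "AliasTarget": {
          "HostedZoneId": "Z2FDTNDATAQYW2",
          "DNSName": "d111111abcdef8.cloudfront.net",
          "EvaluateTargetHealth": false
        }
      }
    }]
  }'
```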