
AWS Certified Solutions Architect – Professional SAP-C02 Exam Questions and Answers – 6

The latest AWS Certified Solutions Architect – Professional SAP-C02 practice exam questions and answers (Q&A) are available free of charge to help you pass the AWS Certified Solutions Architect – Professional SAP-C02 exam and earn the AWS Certified Solutions Architect – Professional certification.

Question 531

Exam Question

A retail company has hired you as an AWS Certified Solutions Architect Professional to provide consultancy for managing a serverless application that consists of multiple API gateways, Lambda functions, S3 buckets, and DynamoDB tables. Customers report that some of the application components lag while loading dynamic images and others time out with a “504 Gateway Timeout” error. As part of your investigation to identify the root cause of this issue, you have confirmed that the DynamoDB monitoring metrics are at acceptable levels.

Which of the following steps would you recommend to address these application issues? (Select two)

A. Process and analyze the VPC Flow Logs to determine if there is packet loss between the Lambda function and S3.

B. Process and analyze the Amazon CloudWatch Logs for Lambda function to determine processing times for requested images at pre-configured intervals.

C. Process and analyze the AWS X-Ray traces and analyze HTTP methods to determine the root cause of the HTTP errors.

D. Enable execution logging for the API Gateway. Process and analyze the execution logs in the API Gateway for HTTP errors to determine the root cause of the errors.

E. Enable access logging for the API Gateway. Process and analyze the access logs in the API Gateway for HTTP errors to determine the root cause of the errors.

Correct Answer

B. Process and analyze the Amazon CloudWatch Logs for Lambda function to determine processing times for requested images at pre-configured intervals.

C. Process and analyze the AWS X-Ray traces and analyze HTTP methods to determine the root cause of the HTTP errors.
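As a rough illustration of the two recommended steps, the following Python (boto3) sketch pulls AWS X-Ray trace summaries for requests that returned HTTP 504 and runs a CloudWatch Logs Insights query against a Lambda log group to look at processing durations. It assumes X-Ray tracing is already enabled; the log group name and time window are hypothetical placeholders.

```python
import time
from datetime import datetime, timedelta

import boto3

xray = boto3.client("xray")
logs = boto3.client("logs")

end = datetime.utcnow()
start = end - timedelta(hours=1)

# X-Ray: pull trace summaries for requests that returned HTTP 504 (pagination omitted).
traces = xray.get_trace_summaries(
    StartTime=start,
    EndTime=end,
    FilterExpression="http.status = 504",
)
for summary in traces["TraceSummaries"]:
    print(summary["Id"], summary.get("ResponseTime"))

# CloudWatch Logs Insights: inspect Lambda processing times for image requests.
query = logs.start_query(
    logGroupName="/aws/lambda/image-processor",  # hypothetical function name
    startTime=int(start.timestamp()),
    endTime=int(end.timestamp()),
    queryString="fields @timestamp, @duration | sort @duration desc | limit 20",
)
while True:
    result = logs.get_query_results(queryId=query["queryId"])
    if result["status"] in ("Complete", "Failed", "Cancelled"):
        break
    time.sleep(1)
print(result["results"])
```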

Question 532

Exam Question

Which combination of components will enable the company to create a monitoring solution that will satisfy these requirements? (Choose two.)

A. Use Amazon Kinesis Data Firehose to buffer events. Create an AWS Lambda function to process and transform events.

B. Create an Amazon Kinesis data stream to buffer events. Create an AWS Lambda function to process and transform events.

C. Configure an Amazon Aurora PostgreSQL DB cluster to receive events. Use Amazon QuickSight to read from the database and create near-real-time visualizations and dashboards.

D. Configure Amazon Elasticsearch Service (Amazon ES) to receive events. Use the Kibana endpoint deployed with Amazon ES to create near-real-time visualizations and dashboards.

E. Configure an Amazon Neptune DB instance to receive events. Use Amazon QuickSight to read from the database and create near-real-time visualizations and dashboards.

Correct Answer

B. Create an Amazon Kinesis data stream to buffer events. Create an AWS Lambda function to process and transform events.

C. Configure an Amazon Aurora PostgreSQL DB cluster to receive events. Use Amazon QuickSight to read from the database and create near-real-time visualizations and dashboards.
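A minimal sketch of the Lambda consumer described in option B, written in Python: it decodes the base64-encoded Kinesis record payloads and applies a transformation before the events are written to the downstream store used for dashboards. The event fields and the transformation itself are illustrative assumptions, not part of the question.

```python
import base64
import json


def handler(event, context):
    transformed = []
    for record in event["Records"]:
        # Kinesis record payloads arrive base64-encoded.
        payload = base64.b64decode(record["kinesis"]["data"])
        item = json.loads(payload)
        # Hypothetical transformation: keep only the fields the dashboards need.
        transformed.append(
            {
                "event_time": item.get("timestamp"),
                "event_type": item.get("type"),
                "source": item.get("source"),
            }
        )
    # The transformed events would then be written to the downstream store
    # chosen for visualization (see the correct answer above).
    print(json.dumps({"processed": len(transformed)}))
    return {"batchItemFailures": []}
```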

Question 533

Exam Question

A medical technology company has recently set up a hybrid cloud between its on-premises data centers and AWS Cloud. The engineering team at the company has developed a Media Archiving and Communication System application that runs on AWS to support real-time collaboration among radiologists and other specialists. The company uses Amazon S3 to aggregate the raw medical images and video footage from its research teams across the world to discover tremendous medical insights. The technical teams at the overseas research facilities have reported huge delays in uploading large video files to the destination S3 bucket.

As a Solutions Architect Professional, which of the following would you recommend as the MOST cost-effective solutions to improve the file upload speed into S3? (Select two)

A. Use AWS Global Accelerator for faster file uploads into the destination S3 bucket.

B. Create multiple AWS Direct Connect connections between the AWS Cloud and research facilities running in the on-premises data centers. Use the Direct Connect connections for faster file uploads into S3.

C. Use multipart uploads for faster file uploads into the destination S3 bucket.

D. Create multiple site-to-site VPN connections between the AWS Cloud and research facilities running in the on-premises data centers. Use these VPN connections for faster file uploads into S3.

E. Use Amazon S3 Transfer Acceleration to enable faster file uploads into the destination S3 bucket.

Correct Answer

C. Use multipart uploads for faster file uploads into the destination S3 bucket.

E. Use Amazon S3 Transfer Acceleration to enable faster file uploads into the destination S3 bucket.
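The following Python (boto3) sketch combines the two recommendations: it enables S3 Transfer Acceleration on the destination bucket, then uploads through the accelerate endpoint with multipart settings so large files are split into parts and transferred in parallel. The bucket, file name, and object key are hypothetical.

```python
import boto3
from botocore.config import Config
from boto3.s3.transfer import TransferConfig

BUCKET = "medical-footage-bucket"  # hypothetical destination bucket

# One-time: enable Transfer Acceleration on the destination bucket.
boto3.client("s3").put_bucket_accelerate_configuration(
    Bucket=BUCKET,
    AccelerateConfiguration={"Status": "Enabled"},
)

# Upload through the accelerate endpoint using multipart upload.
s3 = boto3.client("s3", config=Config(s3={"use_accelerate_endpoint": True}))
transfer_config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,  # switch to multipart above 64 MB
    multipart_chunksize=64 * 1024 * 1024,
    max_concurrency=10,
)
s3.upload_file(
    "raw-footage.mp4",          # local file, hypothetical
    BUCKET,
    "uploads/raw-footage.mp4",  # destination key, hypothetical
    Config=transfer_config,
)
```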

Question 534

Exam Question

A company has an application written using an in-house software framework. The framework installation takes 30 minutes and is performed with a user data script. Company Developers deploy changes to the application frequently. The framework installation is becoming a bottleneck in this process.

Which of the following would speed up this process?

A. Create a pipeline to build a custom AMI with the framework installed and use this AMI as a baseline for application deployments.

B. Employ a user data script to install the framework but compress the installation files to make them smaller.

C. Create a pipeline to parallelize the installation tasks and call this pipeline from a user data script.

D. Configure an AWS OpsWorks cookbook that installs the framework instead of employing user data. Use this cookbook as a base for all deployments.

Correct Answer

A. Create a pipeline to build a custom AMI with the framework installed and use this AMI as a baseline for application deployments.
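A minimal Python (boto3) sketch of the AMI-baking idea, assuming a builder instance that already has the 30-minute framework installation completed: bake the framework into a custom AMI once, then launch application instances from that AMI with only the lightweight deployment steps left in user data. All identifiers and the deploy script path are hypothetical.

```python
import boto3

ec2 = boto3.client("ec2")

# Pipeline build stage: create an AMI from a builder instance that already has
# the framework installed.
image = ec2.create_image(
    InstanceId="i-0123456789abcdef0",    # builder instance, hypothetical
    Name="app-framework-baseline-v42",   # hypothetical AMI name
    Description="In-house framework pre-installed",
)
ec2.get_waiter("image_available").wait(ImageIds=[image["ImageId"]])

# Deployment stage: launch from the baked AMI; user data now only deploys the
# application code instead of reinstalling the framework.
ec2.run_instances(
    ImageId=image["ImageId"],
    InstanceType="m5.large",
    MinCount=1,
    MaxCount=1,
    UserData="#!/bin/bash\n/opt/deploy/pull-latest-app.sh\n",  # hypothetical deploy script
)
```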

Question 535

Exam Question

An analytics company has developed a location-intelligence solution for mobile advertising. The company’s flagship product extracts contextual intelligence from large amounts of location-based data such as place data, event data, digital, mobile, and social data, as well as a variety of other location-specific information. The product then performs a series of data cleansing, normalization, analysis, and machine learning processes to understand the type of audience present in a specific location at a specific time, enabling highly effective mobile audience targeting. The company wants to leverage ElastiCache for Redis in cluster mode to enhance the performance and scalability of its existing two-tier application architecture. The ElastiCache cluster is configured to listen on port 6379. The company has hired you as an AWS Certified Solutions Architect Professional to build a secure solution so that the cache data is protected from unauthorized access.

Which of the following steps would address the given use-case? (Select three)

A. Enable CloudTrail to monitor the API calls for the ElastiCache cluster.

B. Enable CloudWatch Logs to monitor the security credentials for the ElastiCache cluster.

C. Configure the security group for the ElastiCache cluster with the required rules to allow outbound traffic to the cluster’s clients on port 6379.

D. Configure the ElastiCache cluster to have both in-transit as well as at-rest encryption.

E. Create the cluster with the auth-token parameter and make sure that the parameter is included in all subsequent commands to the cluster.

F. Configure the security group for the ElastiCache cluster with the required rules to allow inbound traffic from the cluster itself as well as from the cluster’s clients on port 6379.

Correct Answer

D. Configure the ElastiCache cluster to have both in-transit as well as at-rest encryption.

E. Create the cluster with the auth-token parameter and make sure that the parameter is included in all subsequent commands to the cluster.

F. Configure the security group for the ElastiCache cluster with the required rules to allow inbound traffic from the cluster itself as well as from the cluster’s clients on port 6379.
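A minimal Python (boto3) sketch of the three recommended controls: an inbound security group rule for port 6379, in-transit and at-rest encryption, and a Redis AUTH token supplied at cluster creation. The security group IDs, subnet group, node sizing, and token value are hypothetical placeholders.

```python
import boto3

ec2 = boto3.client("ec2")
elasticache = boto3.client("elasticache")

CACHE_SG = "sg-0cache00000000000"    # security group attached to the cluster, hypothetical
CLIENT_SG = "sg-0client0000000000"   # security group of the application tier, hypothetical

# Allow inbound Redis traffic on port 6379 from the cluster itself and its clients.
ec2.authorize_security_group_ingress(
    GroupId=CACHE_SG,
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 6379,
        "ToPort": 6379,
        "UserIdGroupPairs": [{"GroupId": CLIENT_SG}, {"GroupId": CACHE_SG}],
    }],
)

# Create the cluster-mode-enabled replication group with encryption and AUTH.
elasticache.create_replication_group(
    ReplicationGroupId="audience-cache",              # hypothetical
    ReplicationGroupDescription="Location-intelligence cache",
    Engine="redis",
    CacheNodeType="cache.r6g.large",
    NumNodeGroups=2,
    ReplicasPerNodeGroup=1,
    Port=6379,
    TransitEncryptionEnabled=True,
    AtRestEncryptionEnabled=True,
    AuthToken="replace-with-a-strong-token",          # must accompany all subsequent connections
    SecurityGroupIds=[CACHE_SG],
    CacheSubnetGroupName="cache-subnets",             # hypothetical
)
```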

Question 536

Exam Question

A company is planning the migration of several lab environments used for software testing. An assortment of custom tooling is used to manage the test runs for each lab. The labs use immutable infrastructure for the software test runs, and the results are stored in a highly available SQL database cluster. Although completely rewriting the custom tooling is out of scope for the migration project, the company would like to optimize workloads during the migration.

Which application migration strategy meets this requirement?

A. Re-host

B. Re-platform

C. Re-factor/re-architect

D. Retire

Correct Answer

A. Re-host

Question 537

Exam Question

The engineering team at a social media company is building an Elasticsearch-based index for all the existing files in S3. To build this index, it only needs to read the first 250 bytes of each object in S3, which contain some metadata about the content of the file itself. There are over 100,000 files in the S3 bucket, adding up to 50 TB of data.

As a Solutions Architect Professional, which of the following solutions can be used to build this index MOST efficiently? (Select two)

A. Create an application that will traverse the S3 bucket, issue a Byte Range Fetch for the first 250 bytes, and store that information in Elasticsearch.

B. Use the Database Migration Service to load the entire data from S3 to Elasticsearch, and then Elasticsearch would automatically build the index.

C. Create an application that will use the S3 Select ScanRange parameter to get the first 250 bytes and store that information in Elasticsearch.

D. Create an application that will traverse the S3 bucket, read the entire files one by one, extract the first 250 bytes, and store that information in Elasticsearch.

E. Use the Elasticsearch Import feature to load the entire data from S3 to Elasticsearch, and then Elasticsearch would automatically build the index.

Correct Answer

A. Create an application that will traverse the S3 bucket, issue a Byte Range Fetch for the first 250 bytes, and store that information in Elasticsearch.

C. Create an application that will use the S3 Select ScanRange parameter to get the first 250 bytes and store that information in Elasticsearch.
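A minimal Python (boto3) sketch of option A: list the bucket and fetch only the first 250 bytes of each object with a byte-range GET, then index that header. The bucket name is hypothetical, and the Elasticsearch indexing call is omitted.

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "media-files-bucket"  # hypothetical

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        # Byte-Range Fetch: transfer only bytes 0-249 instead of the whole file.
        resp = s3.get_object(Bucket=BUCKET, Key=obj["Key"], Range="bytes=0-249")
        header = resp["Body"].read()
        # The 250-byte header would be indexed into the Elasticsearch cluster here.
        print(obj["Key"], len(header))
```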

Question 538

Exam Question

A company has a website that enables users to upload videos. Company policy states the uploaded videos must be analyzed for restricted content. An uploaded video is placed in Amazon S3, and a message is pushed to an Amazon SQS queue with the video’s location. A backend application pulls this location from Amazon SQS and analyzes the video. The video analysis is compute-intensive and occurs sporadically during the day. The website scales with demand. The video analysis application runs on a fixed number of instances. Peak demand occurs during the holidays, so the company must add instances to the application during this time. All instances used are currently on-demand Amazon EC2 T2 instances. The company wants to reduce the cost of the current solution.

Which of the following solutions is MOST cost-effective?

A. Keep the website on T2 instances. Determine the minimum number of website instances required during off-peak times and use Spot Instances to cover them while using Reserved Instances to cover the peak demand. Use Amazon EC2 R4 and Amazon EC2 R5 Reserved Instances in an Auto Scaling group for the video analysis application.

B. Keep the website on T2 instances. Determine the minimum number of website instances required during off-peak times and use Reserved Instances to cover them while using On-Demand Instances to cover peak demand. Use Spot Fleet for the video analysis application comprised of Amazon EC2 C4 and Amazon EC2 C5 Spot Instances.

C. Migrate the website to AWS Elastic Beanstalk and Amazon EC2 C4 instances. Determine the minimum number of website instances required during off-peak times and use On-Demand Instances to cover them while using Spot capacity to cover peak demands. Use Spot Fleet for the video analysis application comprised of C4 and Amazon EC2 C5 instances.

D. Migrate the website to AWS Elastic Beanstalk and Amazon EC2 R4 instances. Determine the minimum number of website instances required during off-peak times and use Reserved Instances to cover them while using On-Demand Instances to cover peak demands. Use Spot Fleet for the video analysis application comprised of R4 and Amazon EC2 R5 instances.

Correct Answer

B. Keep the website on T2 instances. Determine the minimum number of website instances required during off-peak times and use Reserved Instances to cover them while using On-Demand Instances to cover peak demand. Use Spot Fleet for the video analysis application comprised of Amazon EC2 C4 and Amazon EC2 C5 Spot Instances.
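As a rough sketch of the Spot Fleet portion of the answer, the following Python (boto3) call requests a fleet that mixes C4 and C5 Spot capacity behind an existing launch template for the video analysis workers. The fleet role ARN, launch template ID, capacity, and allocation strategy are hypothetical assumptions.

```python
import boto3

ec2 = boto3.client("ec2")

ec2.request_spot_fleet(
    SpotFleetRequestConfig={
        "IamFleetRole": "arn:aws:iam::123456789012:role/spot-fleet-role",  # hypothetical
        "AllocationStrategy": "lowestPrice",
        "TargetCapacity": 4,
        "Type": "maintain",
        "LaunchTemplateConfigs": [
            {
                "LaunchTemplateSpecification": {
                    "LaunchTemplateId": "lt-0abc1234def567890",  # hypothetical
                    "Version": "$Latest",
                },
                # Let the fleet choose between compute-optimized C4 and C5 Spot capacity.
                "Overrides": [
                    {"InstanceType": "c5.xlarge"},
                    {"InstanceType": "c4.xlarge"},
                ],
            }
        ],
    }
)
```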

Question 539

Exam Question

A big data analytics company analyzes customer movement data for brick-and-mortar retailers to help them optimize marketing, merchandising, and operations performance by measuring foot traffic, store visits, walk-by conversion, bounce rate, visit duration, and customer loyalty. The company leverages its proprietary analytics workflows built on Redshift to correlate traffic with marketing campaigns and to help retailers optimize hours for peak traffic, among other activities. The company has hired you as an AWS Certified Solutions Architect Professional to review the company’s Redshift cluster, which has now become an integral part of its technology solutions. You have been asked to improve the reliability and availability of the cluster in case of a disaster and provide options to ensure that if an issue arises, the cluster can either operate or be restored within five hours.

Which of the following would you suggest as the BEST solution to meet the business needs in the most cost-effective way?

A. Set up two identical Amazon Redshift clusters in different regions in a primary-secondary configuration. Create a cron job to run the UNLOAD command every five hours to export data for all tables in primary cluster to S3. Use cross-region replication from the primary region to secondary region. Create another cron job to ingest the data for all tables from S3 into the secondary cluster using the LOAD command.

B. Set up a CloudFormation stack set for Redshift cluster creation so it can be launched in another Region and configure Amazon Redshift to automatically copy snapshots for the cluster to the other AWS Region. In case of a disaster, restore the cluster in the other AWS Region from that Region’s snapshot.

C. Set up two identical Amazon Redshift clusters in different regions in a primary-secondary configuration. Develop a solution using the Kinesis Data Streams to collect the data prior to ingestion into the primary Redshift cluster and stream the data to the secondary cluster.

D. Configure the Amazon Redshift cluster to make use of Auto Scaling groups with the nodes in the cluster spread across multiple Availability Zones (AZs). In case of a disaster, the nodes in the other AZs will ensure reliability and availability.

Correct Answer

B. Set up a CloudFormation stack set for Redshift cluster creation so it can be launched in another Region and configure Amazon Redshift to automatically copy snapshots for the cluster to the other AWS Region. In case of a disaster, restore the cluster in the other AWS Region from that Region’s snapshot.
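A minimal Python (boto3) sketch of the snapshot-based DR in the correct answer: enable automatic cross-Region snapshot copy from the primary Region, and restore a new cluster from the latest copied snapshot in the other Region during a disaster. Cluster identifiers, Regions, and retention period are hypothetical.

```python
import boto3

# In the primary Region: copy automated snapshots to the DR Region.
redshift_primary = boto3.client("redshift", region_name="us-east-1")
redshift_primary.enable_snapshot_copy(
    ClusterIdentifier="analytics-cluster",   # hypothetical
    DestinationRegion="us-west-2",
    RetentionPeriod=7,                        # days to keep copied snapshots
)

# During a disaster: restore in the DR Region from the latest copied snapshot.
redshift_dr = boto3.client("redshift", region_name="us-west-2")
snapshots = redshift_dr.describe_cluster_snapshots(
    ClusterIdentifier="analytics-cluster",
)["Snapshots"]
latest = max(snapshots, key=lambda s: s["SnapshotCreateTime"])
redshift_dr.restore_from_cluster_snapshot(
    ClusterIdentifier="analytics-cluster-dr",
    SnapshotIdentifier=latest["SnapshotIdentifier"],
)
```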

Question 540

Exam Question

A company has an internal AWS Elastic Beanstalk worker environment inside a VPC that must access an external payment gateway API available on an HTTPS endpoint on the public internet. Because of security policies, the payment gateway’s Application team can grant access to only one public IP address.

Which architecture will set up an Elastic Beanstalk environment to access the company’s application without making multiple changes on the company’s end?

A. Configure the Elastic Beanstalk application to place Amazon EC2 instances in a private subnet with an outbound route to a NAT gateway in a public subnet. Associate an Elastic IP address to the NAT gateway that can be whitelisted on the payment gateway application side.

B. Configure the Elastic Beanstalk application to place Amazon EC2 instances in a public subnet with an internet gateway. Associate an Elastic IP address to the internet gateway that can be whitelisted on the payment gateway application side.

C. Configure the Elastic Beanstalk application to place Amazon EC2 instances in a private subnet. Set an HTTPS_PROXY application parameter to send outbound HTTPS connections to an EC2 proxy server deployed in a public subnet. Associate an Elastic IP address to the EC2 proxy host that can be whitelisted on the payment gateway application side.

D. Configure the Elastic Beanstalk application to place Amazon EC2 instances in a public subnet. Set the HTTPS_PROXY and NO_PROXY application parameters to send non-VPC outbound HTTPS connections to an EC2 proxy server deployed in a public subnet. Associate an Elastic IP address to the EC2 proxy host that can be whitelisted on the payment gateway application side.

Correct Answer

A. Configure the Elastic Beanstalk application to place Amazon EC2 instances in a private subnet with an outbound route to a NAT gateway in a public subnet. Associate an Elastic IP address to the NAT gateway that can be whitelisted on the payment gateway application side.
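A minimal Python (boto3) sketch of the NAT gateway setup in the correct answer: allocate one Elastic IP (the single address the payment gateway team whitelists), attach it to a NAT gateway in a public subnet, and route the private subnet's outbound traffic through it. The subnet and route table IDs are hypothetical.

```python
import boto3

ec2 = boto3.client("ec2")

# One static public IP that the payment gateway team can whitelist.
eip = ec2.allocate_address(Domain="vpc")

# NAT gateway in the public subnet, using that Elastic IP.
nat = ec2.create_nat_gateway(
    SubnetId="subnet-0public0000000000",      # hypothetical public subnet
    AllocationId=eip["AllocationId"],
)
ec2.get_waiter("nat_gateway_available").wait(
    NatGatewayIds=[nat["NatGateway"]["NatGatewayId"]]
)

# Default route from the private subnet (where the Beanstalk workers run) to the NAT gateway.
ec2.create_route(
    RouteTableId="rtb-0private000000000",     # hypothetical private route table
    DestinationCidrBlock="0.0.0.0/0",
    NatGatewayId=nat["NatGateway"]["NatGatewayId"],
)
print("Whitelist this IP with the payment gateway:", eip["PublicIp"])
```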