SAP-C02: Real-Time AWS Monitoring with Splunk Using Amazon Kinesis Data Firehose

Learn how to stream VPC flow logs to Splunk with Amazon Kinesis Data Firehose for near-real-time monitoring of Amazon EC2 network traffic. Discover how VPC flow logs, CloudWatch Logs subscription filters, and an AWS Lambda pre-processing function fit together.

Question

A company has developed a hybrid solution between its data center and AWS. The company uses Amazon VPC and Amazon EC2 instances that send application logs to Amazon CloudWatch. The EC2 instances read data from multiple relational databases that are hosted on premises.

The company wants to monitor which EC2 instances are connected to the databases in near-real time. The company already has a monitoring solution that uses Splunk on premises. A solutions architect needs to determine how to send networking traffic to Splunk.

How should the solutions architect meet these requirements?

A. Enable VPC flow logs, and send them to CloudWatch. Create an AWS Lambda function to periodically export the CloudWatch logs to an Amazon S3 bucket by using the pre-defined export function. Generate ACCESS_KEY and SECRET_KEY AWS credentials. Configure Splunk to pull the logs from the S3 bucket by using those credentials.
B. Create an Amazon Kinesis Data Firehose delivery stream with Splunk as the destination. Configure a pre-processing AWS Lambda function with a Kinesis Data Firehose stream processor that extracts individual log events from records sent by CloudWatch Logs subscription filters. Enable VPC flow logs, and send them to CloudWatch. Create a CloudWatch Logs subscription that sends log events to the Kinesis Data Firehose delivery stream.
C. Ask the company to log every request that is made to the databases along with the EC2 instance IP address. Export the CloudWatch logs to an Amazon S3 bucket. Use Amazon Athena to query the logs grouped by database name. Export Athena results to another S3 bucket. Invoke an AWS Lambda function to automatically send any new file that is put in the S3 bucket to Splunk.
D. Send the CloudWatch logs to an Amazon Kinesis data stream with Amazon Kinesis Data Analytics for SQL Applications. Configure a 1-minute sliding window to collect the events. Create a SQL query that uses the anomaly detection template to monitor any networking traffic anomalies in near-real time. Send the result to an Amazon Kinesis Data Firehose delivery stream with Splunk as the destination.

Answer

B. Create an Amazon Kinesis Data Firehose delivery stream with Splunk as the destination. Configure a pre-processing AWS Lambda function with a Kinesis Data Firehose stream processor that extracts individual log events from records sent by CloudWatch Logs subscription filters. Enable VPC flow logs, and send them to CloudWatch. Create a CloudWatch Logs subscription that sends log events to the Kinesis Data Firehose delivery stream.

Explanation

This approach uses Amazon Kinesis Data Firehose, a fully managed service for delivering streaming data to destinations such as Splunk. Firehose can capture, transform, and load streaming data into these destinations, enabling near-real-time analytics with existing monitoring and business intelligence tools.
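For illustration, here is a minimal boto3 sketch of creating such a delivery stream with Splunk as the destination. The stream name, HEC endpoint, token, and ARNs are placeholders, not values taken from the question.

# Sketch: create a Firehose delivery stream that delivers to a Splunk HEC endpoint.
import boto3

firehose = boto3.client("firehose")

firehose.create_delivery_stream(
    DeliveryStreamName="vpc-flow-logs-to-splunk",            # hypothetical name
    DeliveryStreamType="DirectPut",                          # CloudWatch Logs puts records directly
    SplunkDestinationConfiguration={
        "HECEndpoint": "https://splunk.example.com:8088",    # placeholder on-premises Splunk HEC
        "HECEndpointType": "Raw",
        "HECToken": "REPLACE-WITH-HEC-TOKEN",
        "S3BackupMode": "FailedEventsOnly",
        "S3Configuration": {                                 # backup bucket for events Splunk rejects
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
            "BucketARN": "arn:aws:s3:::firehose-splunk-backup",
        },
        "ProcessingConfiguration": {                         # attach the pre-processing Lambda function
            "Enabled": True,
            "Processors": [{
                "Type": "Lambda",
                "Parameters": [{
                    "ParameterName": "LambdaArn",
                    "ParameterValue": "arn:aws:lambda:us-east-1:123456789012:function:cwlogs-processor",
                }],
            }],
        },
    },
)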

By enabling VPC flow logs and sending them to CloudWatch Logs, you capture information about the IP traffic going to and from the network interfaces in the VPC, including the connections from the EC2 instances to the on-premises databases. A CloudWatch Logs subscription filter then streams these log events to the Kinesis Data Firehose delivery stream.
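The wiring described above could look roughly like the following boto3 sketch; the VPC ID, log group name, and ARNs are illustrative assumptions.

# Sketch: publish VPC flow logs to CloudWatch Logs, then subscribe the log group to Firehose.
import boto3

ec2 = boto3.client("ec2")
logs = boto3.client("logs")

# 1. Enable VPC flow logs, delivered to a CloudWatch Logs log group.
ec2.create_flow_logs(
    ResourceType="VPC",
    ResourceIds=["vpc-0abc1234def567890"],                   # placeholder VPC ID
    TrafficType="ALL",
    LogDestinationType="cloud-watch-logs",
    LogGroupName="/vpc/flow-logs",
    DeliverLogsPermissionArn="arn:aws:iam::123456789012:role/flow-logs-role",
)

# 2. Forward every log event in the group to the Firehose delivery stream.
logs.put_subscription_filter(
    logGroupName="/vpc/flow-logs",
    filterName="to-splunk-firehose",
    filterPattern="",                                        # empty pattern matches all events
    destinationArn="arn:aws:firehose:us-east-1:123456789012:deliverystream/vpc-flow-logs-to-splunk",
    roleArn="arn:aws:iam::123456789012:role/cwlogs-to-firehose-role",
)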

The pre-processing AWS Lambda function, configured as a Kinesis Data Firehose data-transformation processor, extracts the individual log events from the records sent by the CloudWatch Logs subscription filter. CloudWatch Logs delivers these records as gzip-compressed, base64-encoded batches, so the function must decompress each batch and split it into individual events before Firehose forwards them to Splunk, which allows for more granular analysis of the log data (see the sketch below).
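A simplified sketch of such a pre-processing function follows. It mirrors the general pattern of the AWS-provided CloudWatch Logs processor blueprint for Firehose, but omits error handling and re-ingestion of oversized records.

# Sketch: Firehose data-transformation Lambda that expands CloudWatch Logs batches
# into newline-delimited individual log events for Splunk.
import base64
import gzip
import json

def lambda_handler(event, context):
    output = []
    for record in event["records"]:
        # Each Firehose record holds a gzip-compressed, base64-encoded CloudWatch Logs batch.
        payload = json.loads(gzip.decompress(base64.b64decode(record["data"])))

        if payload.get("messageType") != "DATA_MESSAGE":
            # Drop control messages such as subscription test events.
            output.append({"recordId": record["recordId"], "result": "Dropped"})
            continue

        # Re-emit the individual flow log events, one per line.
        events = "\n".join(e["message"] for e in payload["logEvents"]) + "\n"
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(events.encode("utf-8")).decode("utf-8"),
        })
    return {"records": output}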

This solution meets the requirement of monitoring which EC2 instances are connected to the databases in near-real time and integrates with the company’s existing Splunk solution.
