
Amazon Kinesis Data Firehose for detail logging in AWS WAF

Amazon Web Services (AWS) provides various tools which you can use to monitor AWS WAF (Web Application Firewall).

In our previous blog, we saw how to centrally configure and manage AWS WAF rules across multiple accounts and applications using AWS Firewall Manager.

AWS now provides detailed WAF logging through Amazon Kinesis Data Firehose, a fully managed service for delivering real-time streaming data to destinations such as Amazon Simple Storage Service (Amazon S3), Amazon Redshift, Amazon Elasticsearch Service (Amazon ES), and Splunk.

In our setup, Amazon Kinesis Data Firehose stores all logs in an S3 bucket. These detailed logs give us more information about why certain rules are triggered and why certain requests are blocked by our specific web ACL (Access Control List) rules.

For Amazon S3 destinations, streaming data is delivered to your S3 bucket. If data transformation is enabled, you can optionally back up source data to another Amazon S3 bucket.

AWS Firehose Overview Image
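
For readers who prefer to script this rather than click through the console, the sketch below shows roughly what such a delivery stream looks like when created with boto3. It is only a minimal example: the stream name, bucket ARN, role ARN, and region are placeholders, not values from this walkthrough.

    import boto3

    firehose = boto3.client("firehose", region_name="us-east-1")

    # Create a delivery stream that writes incoming records to an S3 bucket.
    # All names and ARNs below are placeholders -- replace them with your own.
    firehose.create_delivery_stream(
        DeliveryStreamName="aws-waf-logs-example",  # WAF logging expects a name starting with aws-waf-logs-
        DeliveryStreamType="DirectPut",
        ExtendedS3DestinationConfiguration={
            "RoleARN": "arn:aws:iam::123456789012:role/firehose_delivery_role",
            "BucketARN": "arn:aws:s3:::my-waf-logs-bucket",
            "Prefix": "waf-logs/",
            "BufferingHints": {"SizeInMBs": 5, "IntervalInSeconds": 300},
            "CompressionFormat": "GZIP",
            # Optional: keep an untouched copy of the source data in a second bucket
            "S3BackupMode": "Disabled",
        },
    )

The console walkthrough below configures the same settings step by step.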

This is done in two steps, assuming WAF is already implemented (if not, first implement WAF with the help of our earlier blog):

  1. Adding a Kinesis Data Firehose delivery stream and choosing a destination for the data from among Amazon S3, Amazon Elasticsearch Service, and Amazon Redshift.
  2. Enabling detailed logging on the existing WAF configuration using the Firehose instance.

Implementation

Follow these steps for implementing Amazon Kinesis Data Firehose:

  1. Select ‘Kinesis Data Firehose’ for a new instance.
Firehose Delivery Streams
  2. Add a unique name for the Firehose delivery stream
Delivery Stream Name
  3. Choose a source to send records to the delivery stream
Delivery Stream Source Record
  4. Transform source records
Transform Source Record with AWS Lambda
  5. Convert record format
Convert Record Format
  6. Select the destination
Select Destination
  7. Configure all settings as required for the S3 bucket and select ‘Enabled’ for error logging (a scripted sketch of these optional settings follows below)
Error Logging
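
As a rough sketch of what steps 4 and 7 map to in the API, the dictionaries below could be merged into the ExtendedS3DestinationConfiguration from the earlier example to enable a Lambda transform and error logging. The Lambda function ARN and the log group and stream names are hypothetical placeholders.

    # Optional additions to ExtendedS3DestinationConfiguration (see the earlier
    # create_delivery_stream sketch). All ARNs and names are placeholders.
    lambda_transform = {
        "ProcessingConfiguration": {
            "Enabled": True,
            "Processors": [{
                "Type": "Lambda",
                "Parameters": [{
                    "ParameterName": "LambdaArn",
                    "ParameterValue": "arn:aws:lambda:us-east-1:123456789012:function:transform-waf-records",
                }],
            }],
        },
    }

    error_logging = {
        "CloudWatchLoggingOptions": {
            "Enabled": True,
            "LogGroupName": "/aws/kinesisfirehose/aws-waf-logs-example",
            "LogStreamName": "S3Delivery",
        },
    }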
  8. Create an IAM role named firehose_delivery_role with the following policy actions:

  • iam:CreateServiceLinkedRole
  • firehose:ListDeliveryStreams
  • firehose:PutLoggingConfiguration

Select the newly created IAM role firehose_delivery_role during Kinesis Data Firehose delivery stream creation.
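
For completeness, here is a minimal boto3 sketch of step 8. It assumes the role is created from scratch: the trust policy lets Firehose assume the role, and the inline policy carries the actions listed above. In practice the delivery role also needs write access to the destination S3 bucket, which the console wizard can add for you.

    import json
    import boto3

    iam = boto3.client("iam")

    # Trust policy that allows Kinesis Data Firehose to assume the role
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "firehose.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }],
    }

    iam.create_role(
        RoleName="firehose_delivery_role",
        AssumeRolePolicyDocument=json.dumps(trust_policy),
    )

    # Inline policy with the actions listed above; S3 permissions for the
    # destination bucket are omitted here and should be added as well.
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": [
                "iam:CreateServiceLinkedRole",
                "firehose:ListDeliveryStreams",
                "firehose:PutLoggingConfiguration",
            ],
            "Resource": "*",
        }],
    }

    iam.put_role_policy(
        RoleName="firehose_delivery_role",
        PolicyName="firehose_delivery_policy",
        PolicyDocument=json.dumps(policy),
    )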

  9. Review all configurations and create the delivery stream

Finally, you will be able to see the active Firehose delivery stream in the console.

Active Firehose
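
The same check can be done programmatically; here is a small sketch using the placeholder stream name from earlier:

    import boto3

    firehose = boto3.client("firehose", region_name="us-east-1")

    # Check the delivery stream status -- it should report ACTIVE once creation finishes
    desc = firehose.describe_delivery_stream(DeliveryStreamName="aws-waf-logs-example")
    print(desc["DeliveryStreamDescription"]["DeliveryStreamStatus"])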
  10. Enable logging

Navigate to the WAF console, choose the region where WAF is configured, open the Logging tab, and configure the ‘Enable Logging’ section.

Enable Logging 1
Enable Logging 2
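
If you prefer to enable logging through the API instead of the console, here is a hedged sketch. It assumes the newer WAFv2 client and a regional web ACL; classic WAF offers an equivalent PutLoggingConfiguration call on the waf and waf-regional clients. WAF only accepts delivery streams whose name begins with aws-waf-logs-, and both ARNs below are placeholders.

    import boto3

    wafv2 = boto3.client("wafv2", region_name="us-east-1")

    # Point the web ACL's logging at the Firehose delivery stream created earlier.
    # Both ARNs are placeholders -- substitute your own.
    wafv2.put_logging_configuration(
        LoggingConfiguration={
            "ResourceArn": "arn:aws:wafv2:us-east-1:123456789012:regional/webacl/my-web-acl/abcd1234",
            "LogDestinationConfigs": [
                "arn:aws:firehose:us-east-1:123456789012:deliverystream/aws-waf-logs-example",
            ],
        }
    )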

We should now see detailed logs being delivered through the Kinesis service into the S3 bucket:

S3 Logs

To test our setup, we can use demo data:

Demo Data Test
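
Instead of the console's demo data, you can also push a test record yourself; here is a minimal sketch with the same placeholder stream name:

    import json
    import boto3

    firehose = boto3.client("firehose", region_name="us-east-1")

    # Send one test record; it should land in the destination S3 bucket after
    # the configured buffering interval elapses.
    firehose.put_record(
        DeliveryStreamName="aws-waf-logs-example",
        Record={"Data": (json.dumps({"test": "demo record"}) + "\n").encode("utf-8")},
    )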

That’s it. Have you already configured and started using Amazon Kinesis Data Firehose? What challenges did you face? What did you learn? If you need help, please leave a comment below and an AWS expert will get in touch with you.


About the Writer

  • Farida Pathan

    Farida Pathan is a DevOps engineer with a diversified background in DevOps, automation, scripting, and more. She has a deep interest in helping customers use Cloud, Containers, and Orchestration, which motivates her to solve complex problems with simple and effective solutions.

How can Synerzip Help You?

By partnering with Synerzip, clients rapidly scale their engineering team, decrease time to market and save at least 50 percent with our Agile development teams in India.