Analyze AWS Traffic Using Flow Logs | Fit-DevOps

Rahul K
9 min read · Sep 30, 2021

This blog was originally published at https://fitdevops.in


In this blog, we will see how to set up logging to analyze the IP traffic going to and from network interfaces in a VPC.

We will cover how to:

  • Publish flow logs to Cloudwatch Log group
  • Publish flow logs to S3 buckets

If you’re planning to build a custom VPC with public and private subnets to host your applications, refer to this article.

What is VPC Flow Logs?

  • VPC Flow Logs is a feature used to capture information about the IP traffic going to and from network interfaces in a VPC.
  • We can configure flow logs to capture this information and send it to either a CloudWatch log group or an S3 bucket.
  • Once the logs are delivered to one of these destinations, we can use the data for further analysis.

VPC flow logs are used for:

  • Monitoring the traffic reaching AWS resources such as EC2 instances.
  • Determining the direction of traffic to and from network interfaces.
  • Diagnosing overly restrictive security group rules.

Collecting flow logs doesn’t impact network throughput or latency.

We will be charged for delivering logs to CloudWatch Logs and to S3.

Understanding Flow Logs

  • Flow logs can be created for a VPC, a subnet, or a network interface.
  • If flow logs are configured for a VPC, all subnets and network interfaces within that VPC are monitored.
  • If flow logs are enabled at the subnet level, the network interfaces within that subnet are monitored.
  • The flow log data collected from network interfaces, subnets, or VPCs is referred to as flow log records.

We need the following information when creating and configuring flow logs:

a) The type of traffic to capture, such as accepted traffic, rejected traffic, or all traffic.

b) The destination where the flow log data should be stored: a CloudWatch log group or an S3 bucket.

c) The resource type (subnet, network interface, or VPC) for which the flow logs should be created.

We can tag the flow logs created for each network interface, subnet, or VPC.

Flow Log records

Flow logs can record the following information:

  • Accepted and rejected traffic
  • Traffic through NAT Gateway
  • Traffic through Transit Gateway
  • TCP flag sequence
  • Security group and Network access control List Rules
  • IPv6 Traffic
  • No data and skipped records
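Each record in the default format is a single space-separated line. As a quick illustration, here is a minimal Python sketch that splits a default (version 2) flow log record into named fields; the sample record below is made up, not from a real account.

```python
# Field order of the VPC Flow Logs default (version 2) record format.
DEFAULT_FIELDS = [
    "version", "account-id", "interface-id", "srcaddr", "dstaddr",
    "srcport", "dstport", "protocol", "packets", "bytes",
    "start", "end", "action", "log-status",
]

def parse_record(line: str) -> dict:
    """Split a space-separated flow log record into named fields."""
    values = line.split()
    if len(values) != len(DEFAULT_FIELDS):
        raise ValueError(f"expected {len(DEFAULT_FIELDS)} fields, got {len(values)}")
    return dict(zip(DEFAULT_FIELDS, values))

# Illustrative record: SSH traffic (port 22) that was accepted.
record = parse_record(
    "2 123456789010 eni-1235b8ca123456789 172.31.16.139 172.31.16.21 "
    "20641 22 6 20 4249 1418530010 1418530070 ACCEPT OK"
)
print(record["action"])   # ACCEPT
print(record["dstport"])  # 22
```

The same idea scales up: once each line is a dict, filtering for `REJECT` records or grouping by `srcaddr` is straightforward.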

As mentioned earlier, flow log data can be published to two destinations:

  • Cloudwatch Log group
  • S3 Bucket

Let’s implement flow logs for both destinations.

Publishing Flow Logs to Cloudwatch Log group

  • Flow log data can be published directly to a CloudWatch log group.
  • A log stream is created for each network interface, and each log stream contains the flow log records.
  • We can create multiple flow logs and send them to a CloudWatch log group based on the traffic type (e.g., accepted traffic).
  • Before creating flow logs, we need to grant the flow logs service permission to publish to the CloudWatch log group.

Let’s create an IAM role for the flow logs service to send data to a CloudWatch log group.

The minimum permissions required are CreateLogGroup, DescribeLogGroups, CreateLogStream, DescribeLogStreams, and PutLogEvents.

To create a policy, log in to the IAM console. In the navigation pane, choose Policies and click Create policy.

In the JSON tab, replace the existing policy with the contents below, provide a name for the policy, and click Create policy.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents",
        "logs:DescribeLogGroups",
        "logs:DescribeLogStreams"
      ],
      "Effect": "Allow",
      "Resource": "*"
    }
  ]
}

Once the policy is created, we need to create an IAM role and attach the policy to it.

The role will be used later while creating flow logs.

To create an IAM role, choose Roles and click Create role.

Choose EC2 as the service for the role and click Next: Permissions.

Search for the policy we just created and select it.

Finally, provide a name for the role and click Create role.

We should also make sure the role has a trust relationship that allows the flow logs service to assume it:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "Service": "vpc-flow-logs.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}

To update it, choose the role we created, select Trust relationships, and click Edit trust relationship.

Replace ec2.amazonaws.com with vpc-flow-logs.amazonaws.com and click Update trust policy.

We have now created the minimum permissions the flow logs service needs to publish flow log data to a CloudWatch log group.

Note the ARN of the role (Role ARN); it will be used while creating flow logs.
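The same policy and role can also be created programmatically. The sketch below uses boto3 (it assumes boto3 is installed and AWS credentials are configured, and is not run here); the names `flow-logs-role` and `flow-logs-policy` are placeholders I made up.

```python
import json

# The permissions policy and trust policy from the console steps above.
POLICY_DOC = {
    "Version": "2012-10-17",
    "Statement": [{
        "Action": [
            "logs:CreateLogGroup", "logs:CreateLogStream",
            "logs:PutLogEvents", "logs:DescribeLogGroups",
            "logs:DescribeLogStreams",
        ],
        "Effect": "Allow",
        "Resource": "*",
    }],
}
TRUST_DOC = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "vpc-flow-logs.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

def create_flow_logs_role(role_name="flow-logs-role",
                          policy_name="flow-logs-policy"):
    """Create the role with the flow logs trust policy, attach the
    permissions policy, and return the role ARN."""
    import boto3  # assumption: boto3 installed, credentials configured
    iam = boto3.client("iam")
    role = iam.create_role(
        RoleName=role_name,
        AssumeRolePolicyDocument=json.dumps(TRUST_DOC),
    )
    policy = iam.create_policy(
        PolicyName=policy_name,
        PolicyDocument=json.dumps(POLICY_DOC),
    )
    iam.attach_role_policy(RoleName=role_name,
                           PolicyArn=policy["Policy"]["Arn"])
    return role["Role"]["Arn"]
```

The returned ARN is the Role ARN we note down for the flow log setup.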

As explained earlier, we can create flow logs for network interfaces, subnets, or a VPC.

To create a flow log for a network interface, log in to the EC2 console. In the navigation pane, choose Network Interfaces.

Select the network interface and, under Actions, click Create flow log.

For Filter, choose the type of traffic to record. Choose All to log both accepted and rejected traffic.

For Maximum aggregation interval, choose the maximum period of time during which a flow is captured and aggregated into one flow log record.

For Destination, choose Send to CloudWatch Logs.

For Destination log group, we should already have a CloudWatch log group created.

To create a log group, go to the CloudWatch console. In the navigation pane, under Logs, choose Log groups.

Click Create log group, provide a name, and click Create.

Make sure to create the log group in the same region where the flow log will be created.

Once we have the log group, continue the flow log setup.

Choose the log group we created.

For IAM role, choose the role we created.

For Log record format, we can keep the default format or configure a custom format if required.

Add a tag for the flow log and click Create.
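These console steps map to a single API call. Here is a hedged boto3 sketch of the same operation (assuming boto3 is installed and credentials are configured; the function is defined but not invoked, and all argument names in the example comments are placeholders):

```python
def create_eni_flow_log(eni_id, log_group, role_arn):
    """Create a flow log for one network interface,
    delivered to a CloudWatch log group."""
    import boto3  # assumption: boto3 installed, credentials configured
    ec2 = boto3.client("ec2")
    resp = ec2.create_flow_logs(
        ResourceIds=[eni_id],               # e.g. "eni-0abc123..." (placeholder)
        ResourceType="NetworkInterface",
        TrafficType="ALL",                  # ACCEPT, REJECT, or ALL
        LogDestinationType="cloud-watch-logs",
        LogGroupName=log_group,             # the log group created above
        DeliverLogsPermissionArn=role_arn,  # the Role ARN we noted earlier
        MaxAggregationInterval=600,         # seconds; valid values are 60 or 600
    )
    return resp["FlowLogIds"]
```

Changing `ResourceType` to `"Subnet"` or `"VPC"` (with the matching IDs) covers the other two resource types.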

To check the flow logs, select the network interface and choose the Flow logs tab.

To delete a flow log for a network interface, use the delete (x) button.

To create a flow log for a subnet, log in to the VPC console. In the navigation pane, choose Subnets.

Select the subnet and, under Actions, click Create flow log.

The process of creating the flow log is the same as for network interfaces.

To create a flow log for a VPC, select the VPC and, under Actions, click Create flow log.

The same procedure applies for the VPC.

Now we know how to configure flow logs for network interfaces, subnets, and VPCs, and publish them to a CloudWatch log group.

Publishing Flow Logs to S3 Bucket

  • If we wish to send flow log data to an S3 bucket instead of a CloudWatch log group, we can configure S3 as the destination.
  • We should have an existing S3 bucket before creating the flow log.
  • The traffic information collected by flow logs is delivered to the bucket and stored as log file objects.

To create an S3 bucket, log in to the S3 console and click Create bucket.

Provide a name for the bucket, choose the region where it should be created, and click Create.

Understanding Flow log files

Flow logs collect flow log records, consolidate them into log files, and publish the log files to the S3 bucket at 5-minute intervals.

Each log file contains the flow log records for the IP traffic recorded in the previous 5 minutes.

The maximum size of each log file is 75 MB. If a log file reaches this limit, flow logs create a new log file and continue storing records there.
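The log files land under a predictable key layout in the bucket (assuming the AWS-documented folder structure of `AWSLogs/account-id/vpcflowlogs/region/year/month/day/`). A small sketch for computing the prefix for a given day, handy when listing or downloading files:

```python
from datetime import date

def flow_log_prefix(account_id: str, region: str, day: date) -> str:
    """Build the S3 key prefix under which flow log files for one day
    are delivered (layout assumed from the documented folder structure)."""
    return (f"AWSLogs/{account_id}/vpcflowlogs/{region}/"
            f"{day.year}/{day.month:02d}/{day.day:02d}/")

print(flow_log_prefix("123456789010", "us-east-1", date(2021, 9, 30)))
# AWSLogs/123456789010/vpcflowlogs/us-east-1/2021/09/30/
```

A prefix like this can be passed to an S3 list call to fetch just one day's worth of log files.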

Creating S3 Bucket Policies for Flow Logs

By default, all the S3 buckets and objects we create are private.

The S3 bucket policy below grants the flow logs service permission to publish logs to the bucket.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AWSLogDeliveryWrite",
      "Effect": "Allow",
      "Principal": {"Service": "delivery.logs.amazonaws.com"},
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::bucket_name/optional_folder/AWSLogs/account_id/*",
      "Condition": {"StringEquals": {"s3:x-amz-acl": "bucket-owner-full-control"}}
    },
    {
      "Sid": "AWSLogDeliveryAclCheck",
      "Effect": "Allow",
      "Principal": {"Service": "delivery.logs.amazonaws.com"},
      "Action": "s3:GetBucketAcl",
      "Resource": "arn:aws:s3:::bucket_name"
    }
  ]
}

If the above permissions are not already attached to the S3 bucket, the flow logs service automatically attaches them when the flow log is created.

Let’s go ahead and create flow logs for the subnets.

Log in to the VPC console. In the navigation pane, choose Subnets.

Select the subnet and click Create flow log.

Choose the type of traffic to record. Choose All to record both accepted and rejected traffic.

For Destination, choose Send to an S3 bucket.

Then provide the ARN of the S3 bucket.

For Log record format, we can use the AWS default format or write a custom format for the logs, which will be stored in the S3 bucket as log files.

Click Create.
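As with the CloudWatch destination, this maps to one API call; only the destination arguments change. A hedged boto3 sketch (assuming boto3 is installed and credentials are configured; the function is defined but not invoked, and the example IDs in the comments are placeholders):

```python
def create_subnet_flow_log_s3(subnet_id, bucket_arn):
    """Create a flow log for one subnet, delivered to an S3 bucket."""
    import boto3  # assumption: boto3 installed, credentials configured
    ec2 = boto3.client("ec2")
    resp = ec2.create_flow_logs(
        ResourceIds=[subnet_id],      # e.g. "subnet-0abc123..." (placeholder)
        ResourceType="Subnet",
        TrafficType="ALL",            # record accepted and rejected traffic
        LogDestinationType="s3",
        LogDestination=bucket_arn,    # e.g. "arn:aws:s3:::my-flow-logs-bucket"
    )
    return resp["FlowLogIds"]
```

Note that for the S3 destination no IAM role is passed; the bucket policy shown earlier is what authorizes the delivery.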

To view the flow logs, choose the subnet and select the Flow logs tab.

To create flow logs for network interfaces, log in to the EC2 console. In the navigation pane, choose Network Interfaces; select the network interface and, under Actions, choose Create flow log.

Following the same procedure, we can create flow logs for network interfaces and the VPC.

Conclusion

We have implemented flow logs for network interfaces, subnets, and a VPC to analyze traffic.

I hope this helps you understand the traffic flow across your AWS resources.

Please do check out my other articles.

Originally published at https://fitdevops.in.


Rahul K

Pro-Active Devops Engineer with 5+ years of experience in Linux , Amazon Web Services, Azure , GCP , Devops tools. Blogs here : https://fitdevops.in