AWS CloudWatch Logs via Firehose
AWS CloudWatch Logs is a service that lets you monitor, store, and access log files from your AWS resources and applications. It can collect logs from various AWS services such as EC2, Lambda, and API Gateway. For better usability and control, you can send these logs to Dash0 via an Amazon Data Firehose stream.
CloudWatch Logs events can be sent to Firehose using CloudWatch subscription filters. Firehose can then deliver the log events to Dash0 through an HTTP endpoint. This guide will teach you how.
Follow the AWS documentation to create a Firehose delivery stream from the AWS console:
1. Choose source and destination
Choose "Direct PUT" as the source and "HTTP Endpoint" as the destination of your Firehose stream.
2. Firehose stream name
Give your Firehose stream a name.
3. Destination settings
In HTTP Endpoint URL, enter the AWS CloudWatch Logs via Firehose HTTP endpoint:
Create a Dash0 authentication token for the Firehose stream. By default, data will be sent to the default dataset. If you want to send the data to another dataset, you can restrict the token access to that specific dataset. It's also recommended to grant only the ingestion permission to this token.
In the authentication section, you can either choose the "Use access key" option and enter the authentication token directly as the Access Key:
or use AWS Secrets Manager to retrieve the token programmatically.
In the Parameters section, you can add additional attributes to the log events. For example, you can add cloud.region as the key and the region of your Firehose stream, such as eu-west-1, as the value. This attribute will be added to the log group resource in Dash0 and lets you distinguish log groups from different regions.
You can keep the rest of the settings at their defaults. Data sent from CloudWatch Logs to Amazon Data Firehose is already compressed with gzip (level 6), so you do not need to enable compression in your Firehose delivery stream.
4. Backup settings
Create or choose an existing S3 bucket to store data in case of delivery failures.
Follow the AWS documentation to create a CloudWatch subscription filter that sends any incoming log events matching your defined filters to the Firehose delivery stream you created above. You can skip steps 1 to 7 in the AWS documentation, which cover creating a Firehose delivery stream, and go directly to step 8.
1. Create an IAM role to set up permissions for CloudWatch Logs to send data to Firehose
Steps 8 to 11 in the AWS documentation guide you through creating such an IAM role via the command line.
If you are using the AWS Management Console, you can go to IAM > Roles and create a new role.
Choose Custom trust policy and update the trust policy statement with:
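A minimal sketch of such a trust policy, following the standard AWS pattern that allows the CloudWatch Logs service to assume the role (the AWS documentation additionally shows how to scope it to your account with a Condition):

```json
{
  "Version": "2012-10-17",
  "Statement": {
    "Effect": "Allow",
    "Principal": {
      "Service": "logs.amazonaws.com"
    },
    "Action": "sts:AssumeRole"
  }
}
```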
Click Next to go to the Add Permissions step. You can either select the AmazonKinesisFirehoseFullAccess managed policy or create a custom policy that grants only the firehose:PutRecord permission to the role, using the following statement.
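A sketch of such a custom policy; the Region, account ID, and stream name in the Resource ARN are placeholders you need to replace with your own values:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "firehose:PutRecord",
      "Resource": "arn:aws:firehose:eu-west-1:123456789012:deliverystream/my-firehose-stream"
    }
  ]
}
```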
At the final step, name the role and click Create role to complete the setup.
2. Create a subscription filter for the log group
In the AWS console, go to CloudWatch > Logs > Log groups and select the log group you want to send to Firehose. Click Actions > Create subscription filter > Create Amazon Data Firehose Subscription filter.
In Choose destination, select the Firehose delivery stream you created in the previous step.
In Grant permission, select the IAM role you created in the previous step.
Choose the log format and filter pattern for the log data you want to send to Firehose.
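Filter patterns use the CloudWatch Logs filter pattern syntax: an empty pattern forwards every log event, a plain term matches events containing that term, and for JSON-formatted logs you can match on fields. For example:

```
ERROR
{ $.level = "error" }
```

The first pattern matches any event containing the term ERROR; the second matches JSON events whose level field equals "error".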
Once everything is set up, you can find your CloudWatch logs in Dash0 using the built-in AWS CloudWatch logging view.