Store SES email logs to S3 with Kinesis
Are you blasting out emails and need to make sure every email was sent properly? You can do it with one of the AWS services, Amazon Simple Email Service (SES).
However, SES doesn't write its logs to CloudWatch or S3 by default, and we need log files for the emails that failed to send.
Reference: https://forums.aws.amazon.com/thread.jspa?threadID=103350
- We can use AWS SNS to get a notification for each failed email; however, SNS cannot work across regions, and we would still need a triggered Lambda function to store the logs.
- We can use CloudWatch; however, CloudWatch does not save the SES logs themselves, it only provides metrics for specific events.
- We can use Firehose to stream the logs through Kinesis and store them in S3, which is a global service.
Conclusion
As we need to get metrics easily and also need to store the logs, we will go with the Firehose solution.
To set up Firehose to receive the log stream and store it in S3, you can follow these steps.
Set Up SES Configuration Sets
- Log in to the AWS console.
- Find the service named 'Simple Email Service'.
- Create a configuration set (if you prefer to script this step, see the sketch below).
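As a scripted alternative to clicking through the console, a minimal sketch with boto3 could look like the following; the region and the configuration set name 'my-ses-config-set' are assumptions for illustration, not values from this article.

import boto3

# SES v1 client; the region here is an assumption, use the one you send from.
ses = boto3.client("ses", region_name="us-east-1")

# Create the configuration set that the Firehose event destination will attach to.
ses.create_configuration_set(
    ConfigurationSet={"Name": "my-ses-config-set"}
)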
Once that's done, you can configure the destination type.
1. Select the configuration set that you created before.
2. Select a destination type.
3. Select Firehose.
4. Make sure to tick 'Enabled'.
5. Give your destination a name, e.g. 'my-ses-to-firehose'.
6. Select the proper event types; you can choose which events you want to capture.
7. Select a stream if you already have one; if not, you can simply create a new stream.
8. Select an IAM role that gives SES access to Kinesis. The role's policy should look like this:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Action": [
        "firehose:PutRecordBatch"
      ],
      "Resource": [
        "arn:aws:firehose:REGION:ACCOUNT-ID:deliverystream/DELIVERY-STREAM-NAME"
      ]
    }
  ]
}
9. If you don't have a role that allows SES to access Kinesis, you can click "Create new role yourself", then add the above policy to the role.
10. Then click Save.
11. Then you need to edit the trust relationship on your role. Edit it with JSON like this:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "",
      "Effect": "Allow",
      "Principal": {
        "Service": "ses.amazonaws.com"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": {
          "sts:ExternalId": "[AWS account ID]"
        }
      }
    }
  ]
}
12. Save the updated trust relationship. (The same event destination setup can also be done from the API, as sketched below.)
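If you prefer the API over the console, the event destination setup above can be sketched with boto3 roughly like this; the configuration set name, role ARN, and delivery stream ARN are placeholders, not values from this article.

import boto3

ses = boto3.client("ses", region_name="us-east-1")  # region is an assumption

# Attach a Firehose event destination to the configuration set,
# mirroring steps 1-12 above.
ses.create_configuration_set_event_destination(
    ConfigurationSetName="my-ses-config-set",   # the set created earlier (placeholder)
    EventDestination={
        "Name": "my-ses-to-firehose",           # destination name from step 5
        "Enabled": True,
        # Choose only the event types you want to log (step 6).
        "MatchingEventTypes": ["send", "reject", "bounce", "complaint", "delivery"],
        "KinesisFirehoseDestination": {
            "IAMRoleARN": "arn:aws:iam::ACCOUNT-ID:role/ses-to-firehose-role",
            "DeliveryStreamARN": "arn:aws:firehose:REGION:ACCOUNT-ID:deliverystream/DELIVERY-STREAM-NAME",
        },
    },
)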
Creating Log Stream
- Go to AWS Kinesis.
- Select Data Firehose.
- Click Create Delivery Stream.
- Give the delivery stream a name.
- Leave the source as "Direct PUT or other sources", then click Next.
- Leave the configuration as default, then click Next.
- You can choose which AWS service your logs will be stored in. In this case, I would like to use S3, so tick "Amazon S3".
- Choose a bucket if you already have one to store the logs; if not, you need to create a new one.
- You can specify a prefix for the bucket (e.g. if you have a folder inside the bucket and want to store the logs inside that folder).
- Choose the role that allows Firehose to write to S3 (and use your KMS key, if any). You may create a new role for this.
- Then you can review your settings before creating the delivery stream. Once done, click the "Create delivery stream" button.
- After the delivery stream is created, you can test whether it works. Go back to the Data Firehose menu and select the stream you want to test.
- Select the stream and click "Test with demo data".
After running the demo data, you can stop it and check the S3 bucket that you set to store the logs. The same creation and test flow can also be scripted, as sketched below.
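Here is a minimal boto3 sketch of that creation and test flow, assuming placeholder names for the stream, bucket, and role (these are not values from this article):

import boto3

firehose = boto3.client("firehose", region_name="us-east-1")  # region is an assumption

# Create a Direct PUT delivery stream that writes to S3 (placeholder ARNs).
firehose.create_delivery_stream(
    DeliveryStreamName="DELIVERY-STREAM-NAME",
    DeliveryStreamType="DirectPut",
    ExtendedS3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::ACCOUNT-ID:role/firehose-to-s3-role",
        "BucketARN": "arn:aws:s3:::my-ses-log-bucket",
        "Prefix": "ses-logs/",  # optional "folder" inside the bucket
    },
)

# Roughly what "Test with demo data" does: push one test record into the stream.
# Note: wait until the stream status is ACTIVE before putting records.
firehose.put_record(
    DeliveryStreamName="DELIVERY-STREAM-NAME",
    Record={"Data": b'{"test": "demo data"}\n'},
)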
If you want your email blast's logs to be captured by this approach, you need to add a header to your message. Put this header on every message you send:
X-SES-CONFIGURATION-SET: [your SES Configuration set name]
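For example, with boto3 and a raw MIME message this looks roughly like the following; the sender, recipient, and configuration set name are placeholders.

import boto3
from email.mime.text import MIMEText

ses = boto3.client("ses", region_name="us-east-1")  # region is an assumption

# Build a simple message and add the configuration set header so SES publishes
# this email's events to the Firehose destination configured above.
msg = MIMEText("Hello from the email blast!")
msg["Subject"] = "Test blast"
msg["From"] = "sender@example.com"       # must be a verified SES identity (placeholder)
msg["To"] = "recipient@example.com"      # placeholder
msg["X-SES-CONFIGURATION-SET"] = "my-ses-config-set"  # your configuration set name

ses.send_raw_email(
    Source=msg["From"],
    Destinations=[msg["To"]],
    RawMessage={"Data": msg.as_string()},
)

If you send with the regular SendEmail API instead of raw messages, boto3's send_email also accepts a ConfigurationSetName parameter, which has the same effect as the header.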
Happy Clouding, guys!