Greeting
Hi Issei,
Thanks for reaching out! Your question about batching multiple log records in a single PutLogEvents
call when forwarding AWS Fargate stdout to CloudWatch Logs is both practical and relevant. Let’s explore the solution together.
Clarifying the Issue
You’re dealing with a scenario where Fargate’s stdout logs are sent to CloudWatch Logs, and you want to confirm if multiple records can be batched into a single PutLogEvents
call. Specifically, you’re trying to use a subscription filter to forward logs to Amazon Kinesis Data Firehose, but you’re concerned about inefficiencies caused by handling one record per call.
Your goal is to maximize efficiency by batching multiple records into a single PUT request, reducing processing overhead and potential bottlenecks.
Key Terms
- PutLogEvents API: Allows up to 10,000 log events or 1 MB of data to be sent to a CloudWatch log stream in a single API call.
- AWS Fargate stdout: The default output of containerized applications running on Fargate, which can be forwarded to CloudWatch Logs.
- CloudWatch Subscription Filter: A feature that enables the streaming of logs from CloudWatch to destinations like Kinesis Firehose in near real-time.
- Amazon Kinesis Data Firehose: A managed service for streaming data into destinations such as S3, Redshift, and Elasticsearch, with buffering and retry capabilities.
The Solution (Our Recipe)
Steps at a Glance:
- Confirm that PutLogEvents supports batching multiple records in a single API call.
- Structure your Fargate application logs for batching.
- Configure a CloudWatch subscription filter to forward logs in batches to Kinesis Firehose.
- Optimize Kinesis Firehose buffer settings for efficient log processing.
Step-by-Step Guide:
- Confirm Batch Limits of PutLogEvents:
The AWS CloudWatch PutLogEvents API allows up to 10,000 log events or 1 MB of data per batch. These constraints apply to both the number of records and the payload size, so ensure your application aggregates logs to stay within these limits. Example CLI snippet to send logs:
```
aws logs put-log-events --log-group-name "my-log-group" \
  --log-stream-name "my-log-stream" \
  --log-events '[{"timestamp": 1672185600000, "message": "Record 1"}, {"timestamp": 1672185601000, "message": "Record 2"}]'
```
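If you aggregate outside the application, a small script can do the batching before a single call. Here is a minimal bash sketch, assuming jq is installed, the log group and stream already exist, and app.log is a placeholder file with one record per line (a sketch, not a production tool):
```
# Sketch: pack many records into one PutLogEvents call.
# Assumptions: jq installed, log group/stream exist, app.log is a placeholder.
TS=$(date +%s%3N)   # current time in milliseconds (GNU date)

# Turn each line into a {timestamp, message} event, then slurp all
# events into a single JSON array for one API call.
EVENTS=$(jq -R --arg ts "$TS" \
  '{timestamp: ($ts | tonumber), message: .}' < app.log | jq -s '.')

aws logs put-log-events \
  --log-group-name "my-log-group" \
  --log-stream-name "my-log-stream" \
  --log-events "$EVENTS"
```
Sharing one timestamp keeps the batch in the non-decreasing chronological order the API requires; also remember each event counts about 26 bytes of overhead toward the 1 MB limit.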
- Update Fargate Logging Configuration:
Modify your ECS Fargate task definition to use the awslogs driver. For example:
```
"logConfiguration": {
  "logDriver": "awslogs",
  "options": {
    "awslogs-group": "my-log-group",
    "awslogs-region": "us-east-1",
    "awslogs-stream-prefix": "my-app"
  }
}
```
If your application uses a logging library, configure it to buffer and batch log output before sending to CloudWatch Logs; a note on driver-level buffering follows below.
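On ECS, the awslogs driver itself exposes a buffering knob: non-blocking mode with an in-memory buffer, which lets bursts of stdout accumulate rather than block the container. A sketch of the extra options (the 25m buffer size is illustrative; tune it to your log volume):
```
"logConfiguration": {
  "logDriver": "awslogs",
  "options": {
    "awslogs-group": "my-log-group",
    "awslogs-region": "us-east-1",
    "awslogs-stream-prefix": "my-app",
    "mode": "non-blocking",
    "max-buffer-size": "25m"
  }
}
```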
- Configure a CloudWatch Subscription Filter:
Create a subscription filter to forward logs to Kinesis Firehose. Note that delivery to Firehose requires an IAM role that CloudWatch Logs can assume (the role ARN below is a placeholder). Use the following CLI command:
```
aws logs put-subscription-filter --log-group-name "my-log-group" \
  --filter-name "MyFilter" \
  --filter-pattern "" \
  --destination-arn "arn:aws:firehose:us-east-1:123456789012:deliverystream/my-stream" \
  --role-arn "arn:aws:iam::123456789012:role/CWLtoFirehoseRole"
```
The empty filter pattern "" ensures all log records are forwarded.
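To confirm the filter is attached and pointing at the intended delivery stream, list the filters on the log group:
```
aws logs describe-subscription-filters --log-group-name "my-log-group"
```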
- Optimize Firehose Buffer Settings:
Adjust the buffer size and interval in Kinesis Firehose to optimize throughput. For example:
- Buffer Size: Set to 5 MB for high-volume logs.
- Buffer Interval: Set to 300 seconds (5 minutes) for larger batches, or reduce it for near-real-time delivery.
These settings can be configured in the Firehose console or via the AWS CLI. Note that update-destination also needs the destination ID (shown here with the typical default for a single-destination stream) and takes the buffering hints through the extended S3 destination update parameter:
```
aws firehose update-destination --delivery-stream-name my-stream \
  --current-delivery-stream-version-id 1 \
  --destination-id destinationId-000000000001 \
  --extended-s3-destination-update '{"BufferingHints": {"SizeInMBs": 5, "IntervalInSeconds": 300}}'
```
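To check that the new hints took effect, and to read the current version ID needed for later updates, describe the stream (the --query path below assumes an S3 destination):
```
aws firehose describe-delivery-stream --delivery-stream-name my-stream \
  --query "DeliveryStreamDescription.Destinations[0].ExtendedS3DestinationDescription.BufferingHints"
```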
Closing Thoughts
By batching log events through PutLogEvents
and configuring your subscription filter and Firehose settings effectively, you can minimize inefficiencies while streaming logs from AWS Fargate. These adjustments will optimize performance, reduce costs, and ensure your system scales effectively with your log volume.
Farewell
I hope this walkthrough helps you set up a robust and efficient logging pipeline, Issei! Let me know if you need further assistance. Good luck with your project! 😊🚀
Cheers,
Aaron 😊