Questions tagged with Amazon Data Firehose

Amazon Data Firehose provides the easiest way to acquire, transform, and deliver data streams within seconds to data lakes, data warehouses, and analytics services.


146 results
I have a Firehose stream which sends data to Redshift. It was working fine, and then all of a sudden it stopped sending data to Redshift. It sends the data fine to S3 but fails to send it to Redshift. I have follo...
1 answer · 0 votes · 16 views · asked 7 days ago
I am emitting records from a Lambda function using the firehose.put_record_batch function. However, Firehose raises a delivery error even though the Python record exactly matches the schema of the table. identifie...
1 answer · 0 votes · 26 views · asked 10 days ago
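A delivery error despite a matching table schema is often a serialization issue rather than a schema one. Below is a minimal sketch of batching records for `put_record_batch`, assuming newline-delimited JSON payloads; the boto3 call and stream name are illustrative placeholders, not taken from the question:

```python
import json

def to_firehose_records(events):
    """Serialize dicts into Firehose PutRecordBatch entries.

    Firehose concatenates records without adding a delimiter, so a
    trailing newline keeps downstream consumers (Athena, Redshift COPY)
    from seeing two JSON objects glued together as one malformed record.
    """
    return [{"Data": (json.dumps(e) + "\n").encode("utf-8")} for e in events]

# Hypothetical usage with boto3 (stream name and response handling are
# assumptions for illustration):
#
#   import boto3
#   firehose = boto3.client("firehose")
#   resp = firehose.put_record_batch(
#       DeliveryStreamName="example-stream",
#       Records=to_firehose_records(events),
#   )
#   if resp["FailedPutCount"]:
#       # A 200 response can still contain per-record failures; inspect
#       # resp["RequestResponses"] and retry only the failed entries.
#       ...
```

Checking `FailedPutCount` matters because `put_record_batch` reports partial failures in the response body rather than raising an exception.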
JSON data with a nested structure is streamed into Firehose via a direct put and delivered to S3 through a configured Glue schema. Querying via Athena or inspecting the S3 files has shown that only...
1 answer · 0 votes · 43 views · asked 12 days ago
I have a Firehose stream to copy JSON data from S3 to Redshift with a Lambda processor in between. The Lambda is formatting the data in base64, and Redshift is successfully copying the data into the table...
1 answer · 0 votes · 29 views · asked 14 days ago
A company has one million users who use its mobile app. The company must analyze the data usage in near-real time. The company also must encrypt the data in near-real time and must store the data in ...
2 answers · 0 votes · 37 views · asked a month ago
For a specific web service in API Gateway, we send incoming JSON requests to SQS, which then forwards them to an EventBridge event bus. A rule on the event bus directs the messages to Firehose for fur...
2 answers · 0 votes · 85 views · asked a month ago
Hello, I have a CloudWatch Logs group with 50 log streams that is sending data to a Firehose stream using a subscription filter without any filter pattern. While ingesting the logs that are written from the ...
2 answers · 0 votes · 50 views · asked 2 months ago
Our Firehose delivery stream has stopped sending to Splunk since its configuration was changed from "manual setup of HEC Token" to "Read from Secrets Manager". We're using "plaintext". Same token...
2 answers · 1 vote · 93 views · asked 2 months ago
I want to export CloudWatch logs to S3 with a different object key (name) based on the log stream name, like "{BucketName}/{LogName}/{StreamName}/{YYYY}/{MM}/{DD}/*.gz". I know that it is possible by w...
1 answer · 0 votes · 63 views · asked 3 months ago
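One way to get stream-based object keys without a custom export Lambda is Firehose dynamic partitioning on the S3 destination. A sketch of the relevant `ExtendedS3DestinationConfiguration` fragment, assuming each record is a JSON object carrying a `logStream` field (prefix paths are placeholders):

```json
{
  "Prefix": "logs/!{partitionKeyFromQuery:stream}/!{timestamp:yyyy}/!{timestamp:MM}/!{timestamp:dd}/",
  "ErrorOutputPrefix": "errors/!{firehose:error-output-type}/",
  "DynamicPartitioningConfiguration": { "Enabled": true },
  "ProcessingConfiguration": {
    "Enabled": true,
    "Processors": [
      {
        "Type": "MetadataExtraction",
        "Parameters": [
          { "ParameterName": "MetadataExtractionQuery",
            "ParameterValue": "{stream: .logStream}" },
          { "ParameterName": "JsonParsingEngine",
            "ParameterValue": "JQ-1.6" }
        ]
      }
    ]
  }
}
```

The `MetadataExtraction` processor pulls the partition key out of each record with a jq expression, and the `!{partitionKeyFromQuery:...}` and `!{timestamp:...}` namespaces substitute it into the S3 prefix at delivery time.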
With CloudWatch Logs' PutLogEvents, it is possible to pass multiple records at once in a single call. When sending Fargate standard output to CloudWatch Logs, is it possible to pass multiple records a...
1 answer · 1 vote · 78 views · asked 3 months ago
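For reference, a minimal sketch of building a multi-event batch for PutLogEvents, assuming (timestamp-in-milliseconds, message) pairs; the boto3 call is shown as a comment and the group/stream names are placeholders:

```python
def build_log_events(messages):
    """Build a PutLogEvents-shaped batch from (timestamp_ms, text) pairs.

    Events in a single PutLogEvents call must be in chronological order
    by timestamp, so the batch is sorted before being returned.
    """
    return [
        {"timestamp": ts, "message": msg}
        for ts, msg in sorted(messages, key=lambda pair: pair[0])
    ]

# Hypothetical usage with boto3 (names are placeholders):
#
#   import boto3
#   logs = boto3.client("logs")
#   logs.put_log_events(
#       logGroupName="/example/app",
#       logStreamName="example-stream",
#       logEvents=build_log_events([(1700000000000, "started"),
#                                   (1700000001000, "done")]),
#   )
```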
**Resolved**: I discovered that the ELB created to configure the VPC Services cannot have a security group associated with it. I didn’t test with a security group completely open to 0.0.0.0/0 (both in...
3 answers · 1 vote · 420 views · asked 3 months ago
We have been trying Firehose for Iceberg tables. The source is a Kinesis stream attached to DynamoDB tables with some Lambda processing in between. The table has been successfully filled by Firehose, but ...
1 answer · 0 votes · 277 views · asked 4 months ago