IoT Core Rule SQL to Timestream


I have a payload of:

{"telemetry": [{"feed": "Channel 1", "reading": 52.7,"unit": "PSI"},{"feed": "Channel 2", "reading": 36.29,"unit": "PSI"},{"feed": "Channel 3", "reading": 34.7,"unit": "PSI"}]}

I have tried a couple of rules:

1st rule: SELECT get(telemetry,0).feed as feed, get(telemetry,0).reading as reading, get(telemetry,0).unit as unit FROM 'my_topic/#'. With this rule I only get the 1st element of the array. How would I iterate through the whole array? Its length can vary, so I cannot hard-code the indexes.

2nd rule: SELECT (SELECT VALUE feed FROM telemetry) as feed, (SELECT VALUE reading FROM telemetry) as reading, (SELECT VALUE unit FROM telemetry) as unit FROM 'my_topic/#'

The rule has an action to Timestream with these dimensions:

TenantId = TenantId (hard-coded tenant id)

Feed = ${feed} (I have also tried ${telemetry.feed})

1st problem: I get the error "Errors at dimensions.1: [Dimension value can not be empty.]"

2nd problem: If I keep only one dimension, TenantId, the insert works, but I really want Feed as a dimension. I also do not like ["Channel 1","Channel 2","Channel 3"] being stored as a single value in Timestream, and the reading is not inserted at all when I use the 2nd rule. I much prefer the 1st rule, where each reading gets its own record.

Any help is much appreciated.

asked 2 years ago · 777 views
1 Answer

Hi,

The AWS IoT rule action for Amazon Timestream does not yet support this use case. I would suggest using a Lambda function to write the data into Timestream. You can invoke the Lambda function directly from a rule using a Lambda function action, or you can push the data to Amazon Kinesis or Amazon SQS in order to benefit from batching multiple messages and thus reduce write costs (see https://docs.aws.amazon.com/lambda/latest/dg/invocation-eventsourcemapping.html).
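A minimal sketch of such a Lambda function, assuming the rule forwards the payload unchanged (e.g. SELECT * FROM 'my_topic/#') and using the hypothetical database/table names "my_db" and "my_table", could look like this. It iterates over the telemetry array and writes one record per channel, with Feed as a dimension:

import time
import boto3

DATABASE = "my_db"      # hypothetical database name, replace with yours
TABLE = "my_table"      # hypothetical table name, replace with yours
TENANT_ID = "TenantId"  # hard-coded tenant id, as in the rule

timestream = boto3.client("timestream-write")

def handler(event, context):
    # The rule's Lambda action passes the original JSON payload as the event.
    now_ms = str(int(time.time() * 1000))
    records = []
    for entry in event["telemetry"]:
        records.append({
            "Dimensions": [
                {"Name": "TenantId", "Value": TENANT_ID},
                {"Name": "Feed", "Value": entry["feed"]},
                {"Name": "Unit", "Value": entry["unit"]},
            ],
            "MeasureName": "reading",
            "MeasureValue": str(entry["reading"]),
            "MeasureValueType": "DOUBLE",
            "Time": now_ms,
            "TimeUnit": "MILLISECONDS",
        })
    # A single WriteRecords call accepts up to 100 records.
    timestream.write_records(DatabaseName=DATABASE, TableName=TABLE, Records=records)
    return {"written": len(records)}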

By using a Lambda function you can also benefit from multi-measure records to store the values of your measurements. You can read more about multi-measure records in this blog post https://aws.amazon.com/blogs/database/store-and-analyze-time-series-data-with-multi-measure-records-magnetic-storage-writes-and-scheduled-queries-in-amazon-timestream/ and in the documentation.
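As an illustration, with multi-measure records the per-channel loop in the sketch above could be replaced by a single record per payload. How you map feeds to measure names is up to you; the renaming below is only an assumption:

def build_multi_measure_record(event, now_ms, tenant_id):
    # One row per payload; each channel becomes a named measure in that row.
    return {
        "Dimensions": [{"Name": "TenantId", "Value": tenant_id}],
        "MeasureName": "telemetry",
        "MeasureValueType": "MULTI",
        "MeasureValues": [
            {
                "Name": entry["feed"].replace(" ", "_"),  # e.g. "Channel_1", purely illustrative
                "Value": str(entry["reading"]),
                "Type": "DOUBLE",
            }
            for entry in event["telemetry"]
        ],
        "Time": now_ms,
        "TimeUnit": "MILLISECONDS",
    }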

AWS
EXPERT
answered 2 years ago
  • Any idea on when this will be supported? This is a very common use case, since most payloads contain nested objects. Routing messages first to a Lambda function and then to Timestream adds complexity and cost to the architecture, especially if you need to add buffers to prevent message loss.
