DynamoDB not receiving any data from IoT Core rule, despite CloudWatch logs showing success

0

To whom it may concern: we are trying to save our IoT data to a DynamoDB table.

We followed this guide to the letter, even trying it "headless": instead of using our IoT device, we published and subscribed from the same environment, just as in the documentation.

We have scoured the documentation and other users' posts here on the forums with little to no luck in finding what the cause could be. Note that everything is one-to-one with the aforementioned guide and has been double-checked. For readability's sake, here is the payload again:

{
  "temperature": 28,
  "humidity": 80,
  "barometer": 1013,
  "wind": {
    "velocity": 22,
    "bearing": 255
  }
}

And the SQL statement...

SELECT temperature, humidity, barometer,
  wind.velocity as wind_velocity,
  wind.bearing as wind_bearing,
FROM 'device/+/data'

Here's the resulting success log event:

{
    "timestamp": "2024-02-06 17:44:48.305",
    "logLevel": "DEBUG",
    "traceId": "XXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXX",
    "accountId": "XXXXXXXXXX",
    "status": "Success",
    "eventType": "StartingRuleExecution",
    "clientId": "iotconsole-XXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXX",
    "topicName": "device/22/data",
    "ruleName": "wx_data_ddb",
    "ruleAction": "DynamoAction",
    "principalId": "XXXXXXXXXXXXX"
}

So the question becomes: what exactly is going on on the DynamoDB side of things? I've tried deleting the table and changing the key parameters four times, with every combination of Number and String, and publishing again to each newly made table. I added extra permissions to the IAM role until I hit the policy limit, remade the IAM role multiple times, checked the key and field names in the rule action, triple-checked the message content and case sensitivity, and, out of desperation, even swapped between the eu-west-1 and eu-north-1 AWS regions.

Thank you

3 Answers
-1
Accepted Answer

For future reference, and for the poor souls Googling to the ends of the earth like me: trusting the built-in IoT Core to DynamoDB action did not bear fruit for us. Try doing it with Lambda instead, using the Boto3 Python library and its documentation. It's far more reliable, and CloudWatch actually tells you when something is wrong.

Prerequisite: make sure the Lambda function's execution role has permission to write to your DynamoDB table! You can reuse the same rule you created before from the wrong/outdated/borked documentation; just point its action at the Lambda function instead.
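
If that permission is missing, one way to add it is sketched below with boto3. The role name, policy name and table ARN are placeholders you would replace with your own, and an inline policy is just one of several ways to grant this.

import json
import boto3

iam = boto3.client('iam')

# Placeholders: your Lambda's execution role name and your table's ARN
role_name = 'YOUR_LAMBDA_EXECUTION_ROLE'
table_arn = 'arn:aws:dynamodb:eu-west-1:123456789012:table/YOUR_TABLE'

policy = {
    'Version': '2012-10-17',
    'Statement': [{
        'Effect': 'Allow',
        'Action': ['dynamodb:PutItem'],
        'Resource': table_arn,
    }],
}

# Attach the policy inline to the Lambda's execution role
iam.put_role_policy(
    RoleName=role_name,
    PolicyName='AllowPutItemFromIotLambda',
    PolicyDocument=json.dumps(policy),
)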

Here's example code for the Lambda handler itself:

import json
import boto3
import logging
from decimal import Decimal

logger = logging.getLogger()
logger.setLevel(logging.INFO)

# Create the DynamoDB resource and table handle once, outside the handler
dynamodb = boto3.resource('dynamodb')
table_name = 'YOUR_DYNAMODB_TABLE_NAME'
table = dynamodb.Table(table_name)

def lambda_handler(event, context):

    # Extract the MQTT JSON message fields from the event
    # (replace the attribute names with the ones your rule forwards)
    primary_key = event.get('PRIMARY_KEY')
    secondary_key = event.get('SECONDARY_KEY')
    other_data = event.get('OTHER_DATA')

    logger.info('PRIMARY_KEY: %s', primary_key)
    logger.info('SECONDARY_KEY: %s', secondary_key)
    # Other data can be added however you may need...
    logger.info('OTHER_DATA: %s', other_data)

    # The JSON has to be dumped and loaded again: if the received data
    # contains float values, they must be converted to Decimal for DynamoDB
    item = json.loads(json.dumps({
        'primary_key': primary_key,
        'secondary_key': secondary_key,
        'other_data': other_data,
        # More data here...
    }), parse_float=Decimal)

    # DynamoDB insert
    table.put_item(Item=item)

    return {
        'statusCode': 200,
        'body': json.dumps('Item written to ' + table_name)
    }
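
One wiring detail not shown above: when an IoT rule invokes a Lambda function, IoT Core also needs resource-based permission to call that function. The console usually adds this for you when you configure the action; if you wire it up with the CLI or SDK you may need to add it yourself. A sketch with boto3, using a placeholder function name, account ID and region, and the rule name from the question:

import boto3

lambda_client = boto3.client('lambda')

# Placeholders: substitute your function name, region and account ID
lambda_client.add_permission(
    FunctionName='YOUR_LAMBDA_FUNCTION',
    StatementId='AllowIotRuleInvoke',
    Action='lambda:InvokeFunction',
    Principal='iot.amazonaws.com',
    SourceArn='arn:aws:iot:eu-west-1:123456789012:rule/wx_data_ddb'
)
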
answered 3 months ago
  • As per my answer, the rule action works fine. Your Lambda code is correct, but I would not advise using a Lambda for an action that the rules engine can perform, because a Lambda will generally cost more.

-1

Hi. I'm sorry to see that it was such a frustrating experience. I believe I know what the problem is. Step 1 of the tutorial instructs you to create a table with a partition key and sort key both of type Number. Step 2 instructs you to create the rule and states:

In Table name, choose the name of the DynamoDB table you created in a previous step: wx_data. The Partition key type and Sort key type fields are filled with the values from your DynamoDB table.

However, the GUI is failing to fill the key types with NUMBER and instead leaves them at the default of STRING.

[Screenshot: the rule action's Partition key type and Sort key type left at STRING]

I'm not sure if this is a regression in the front end code, or a deliberate change that requires a change in the tutorial. I've raised an internal issue to pursue it.

In your question, you show the StartingRuleExecution event type. Per the documentation, this only proves that the rule execution commenced, not that the rule thinks it succeeded. If you search your CloudWatch logs for event type RuleExecution, I'm reasonably sure you'll find an error such as:

One or more parameter values were invalid: Type mismatch for key sample_time expected: N actual: S

If you correct the definition of the rule action, the problem is fixed, and you'll see that the DynamoDB rule action works just fine.
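
If you'd rather verify this outside the console, you can fetch the rule with boto3 and inspect the key types on the DynamoDB action. This is only a sketch against the rule name from the question; I'm assuming the classic single-key dynamoDB action variant, and fields left at their defaults may simply be absent from the response.

import boto3

iot = boto3.client('iot')

# Fetch the rule definition (wx_data_ddb is the rule name from the question)
rule = iot.get_topic_rule(ruleName='wx_data_ddb')['rule']

for action in rule['actions']:
    ddb = action.get('dynamoDB')
    if ddb:
        # For the tutorial's table, both key types should be NUMBER, not STRING
        print('hashKeyField:', ddb.get('hashKeyField'),
              'hashKeyType:', ddb.get('hashKeyType'))
        print('rangeKeyField:', ddb.get('rangeKeyField'),
              'rangeKeyType:', ddb.get('rangeKeyType'))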

Nonetheless, I apologize for the inconvenience. There is a mismatch between the console behaviour and the tutorial instructions. I will get that fixed. Thank you for bringing it to our attention.

AWS
EXPERT
Greg_B
answered 3 months ago
  • I had swapped the partition key types around before, by trial and error. Today, with them set as STRING, there was indeed the expected "Type mismatch for key sample_time" error you describe in the CloudWatch logs. But even after changing them back to the NUMBER type (as they were before), nothing improved: the type-mismatch error is gone, as expected, yet there are still no database entries either way. Would setting up CloudTrail for DynamoDB be useful here, to debug and watch what the underlying issue is and where the data gets lost in the process?

  • You say there's no error in CloudWatch. Do you have event type RuleExecution for your DynamoDB action, with status of Success?

-1

If everything is set up correctly according to the guide and the data is being published successfully (as indicated by your success log event), but you're still facing issues on the DynamoDB side, there could be several potential causes:

  1. DynamoDB Table Configuration: Ensure that the DynamoDB table is correctly configured to match the expected data structure. The table's primary key (and sort key, if used) must match the fields you're expecting to insert/update with your rule.

  2. Rule Query Statement: Double-check the SQL statement in your AWS IoT rule. Ensure that it's correctly formatted and matches the data structure you're publishing. Pay special attention to any potential issues with the SQL syntax, such as missing commas or incorrect field references.

  3. Rule Action Configuration: In the rule action that writes to DynamoDB, make sure that the action is properly configured with the correct table name and role ARN. The role must have the necessary permissions to write to the DynamoDB table.

  4. IAM Role Permissions: Since you've mentioned modifying the IAM role, ensure that the role used by AWS IoT to write to DynamoDB has the dynamodb:PutItem permission for the target table. Also, make sure there are no explicit deny statements that could be overriding the allow permissions.

  5. Data Types and Key Constraints: DynamoDB requires that the data types in the payload match the data types defined in the table schema. Additionally, if you're using any sort keys, ensure that the data you're sending includes these keys and that they conform to the expected data types.

  6. Network Configuration: If you're using VPC endpoints or any other network configuration that could affect connectivity between AWS IoT and DynamoDB, verify that these settings are correctly configured and do not block the necessary traffic.

  7. CloudWatch Logs: Check the CloudWatch Logs for both AWS IoT and DynamoDB. These logs might provide more detailed error messages that can help identify the issue. Look for any errors related to rule execution or failed write attempts to DynamoDB.

  8. Testing with Static Data: As a troubleshooting step, try configuring the rule action to write a static piece of data to the DynamoDB table. This can help identify if the issue is with the dynamic data coming from the devices or with the rule action itself. A complementary check, writing directly to the table, is sketched right after this list.
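
As a complement to point 8 (and to points 1 and 5 about key schema and data types), you can take the rules engine out of the picture entirely and write one static item straight to the table with boto3. If this direct write fails with a type mismatch, the table's key schema or the item's types are the problem; if it succeeds, the fault is most likely in the rule action configuration. This is only a sketch: the table name wx_data and the numeric sample_time/device_id keys are assumptions taken from the tutorial and the error message mentioned above, so adjust them to your table.

import time
import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('wx_data')  # assumed table name from the tutorial

# A static item shaped like the published payload. With the resource-level
# API, numeric values must be int or Decimal, never float.
item = {
    'sample_time': int(time.time() * 1000),  # assumed numeric partition key
    'device_id': 22,                         # assumed numeric sort key
    'temperature': 28,
    'humidity': 80,
    'barometer': 1013,
    'wind_velocity': 22,
    'wind_bearing': 255,
}

table.put_item(Item=item)

# Read the item back to confirm it was stored
response = table.get_item(Key={'sample_time': item['sample_time'],
                               'device_id': item['device_id']})
print(response.get('Item'))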

Given the complexity of AWS configurations and the multitude of places where things might go awry, these suggestions are starting points for deeper investigation.

EXPERT
answered 3 months ago
  • I did the exact same things, then redid them again, the only difference being that this time I also double-checked the network configuration. There is nothing but successes in my CloudWatch logs; the data just disappears once it goes past the rule, as if it never existed. Still no entries using the built-in action: with a "SELECT *" statement, with word-for-word statements from the guide, with raw published values, nothing works, period, even after giving every possibly related permission to both the IoT rule and DynamoDB.

    I got it to work with IoT Core to Lambda to DynamoDB instead. Something is badly broken in the backend, and I cannot see what Amazon doesn't want people to see.

    For future reference, for the poor souls trying to trust the built-in IoT Core to DynamoDB action: either go with S3 or try doing it with Lambda instead, using the Boto3 Python library and its documentation. It's far more reliable, and CloudWatch actually tells you when something is wrong.

    Thanks for the answer, but alas, it's seemingly broken, or the documentation on Amazon's end is badly outdated.
