Handling of DynamoDB JSON-encoded items in Python
Take the following example Python code:
import boto3
import json
table = boto3.resource('dynamodb').Table('exampletable')
record = {'id':'UniqueKeyId', 'counter':10}
table.put_item(Item=record)
response = table.get_item(Key={'id':'UniqueKeyId'}).get('Item')
Very simple - the code writes a single item (or record) to DynamoDB and then reads the item back. A common use case at this point is to take the item and convert it to a string using the Python json module:
json.dumps(response)
But this produces an error: TypeError: Object of type Decimal is not JSON serializable - what's happening there?
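You can reproduce the same error locally without touching DynamoDB, since it comes from the json module itself (a minimal sketch using the same item shape):

```python
import json
from decimal import Decimal

# DynamoDB returns numbers as Decimal objects, which json can't serialize
item = {'id': 'UniqueKeyId', 'counter': Decimal('10')}

try:
    json.dumps(item)
except TypeError as e:
    print(e)  # Object of type Decimal is not JSON serializable
```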
Let's look at the original record and compare it to the response we got back:
print(record)
{'id': 'UniqueKeyId', 'counter': 10}
print(response)
{'id': 'UniqueKeyId', 'counter': Decimal('10')}
You can see that in the response from DynamoDB our integer value has been converted into a Decimal object. In most cases this isn't an issue, because Python will silently convert it back into an integer when we need it - but that doesn't happen in the json module. Luckily the fix is pretty easy:
json.dumps(response, default=int)
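With default=int in place, json.dumps calls int() on any value it can't serialize natively, so the Decimal is converted before serialization (a minimal sketch using the same response shape as above):

```python
import json
from decimal import Decimal

# Same shape as the item returned by get_item
response = {'id': 'UniqueKeyId', 'counter': Decimal('10')}

# default=int is invoked only for objects json can't serialize on its own
print(json.dumps(response, default=int))
```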
Looking at the Python documentation, it says:
If specified, default should be a function that gets called for objects that can’t otherwise be serialized. It should return a JSON encodable version of the object or raise a TypeError. If not specified, TypeError is raised.
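Because default can be any callable, you aren't limited to a built-in type - a custom function can handle several non-serializable types at once. Here's a hypothetical sketch (the function name and type choices are illustrative, not part of boto3):

```python
import json
from decimal import Decimal
from datetime import datetime

def dynamodb_default(obj):
    # Whole-number Decimals become ints, fractional ones become floats
    if isinstance(obj, Decimal):
        return int(obj) if obj == obj.to_integral_value() else float(obj)
    # Example of another commonly non-serializable type
    if isinstance(obj, datetime):
        return obj.isoformat()
    # Per the documented contract, raise TypeError for anything unhandled
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

item = {'counter': Decimal('10'), 'ratio': Decimal('0.5')}
print(json.dumps(item, default=dynamodb_default))  # {"counter": 10, "ratio": 0.5}
```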
So we can actually handle a lot more cases here, but for this particular example anything that can't be serialized should be an int. Note that you could also use float or str in the same place to achieve similar but slightly different outcomes.