Updating records in Salesforce from AWS


Hello,

We are trying to update records in Salesforce with some data. Initially we used the Python library "simple-salesforce" in bulk update mode from an AWS Lambda function, and everything worked fine. However, when thousands of records come into the SQS queue and Lambda processes them, the updates fail with the error "CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY: HDFC_DASH_LeadTrigger: System.LimitException: Too many queueable jobs added to the queue: 51".
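
For reference, our handler looks roughly like the sketch below (a minimal version only; the Lead object, the Status__c field, the environment variable names, and the SQS message format are placeholders, not our exact setup):

```python
import json
import os

from simple_salesforce import Salesforce


def lambda_handler(event, context):
    # Connect to Salesforce; credentials are assumed to come from Lambda environment variables
    sf = Salesforce(
        username=os.environ["SF_USERNAME"],
        password=os.environ["SF_PASSWORD"],
        security_token=os.environ["SF_TOKEN"],
    )

    # Build the update payload from the SQS batch; each record must carry the Salesforce Id
    records = []
    for message in event["Records"]:
        body = json.loads(message["body"])
        records.append({"Id": body["Id"], "Status__c": body["Status__c"]})

    # Bulk API update; every call creates a Bulk API job on the Salesforce side,
    # which is where the trigger/queueable limit gets hit under load
    results = sf.bulk.Lead.update(records)
    return results
```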

Since most of the updates to Salesforce were failing, we integrated AWS AppFlow. Right now a Lambda function writes the data from SQS into an S3 bucket as CSV, and an AppFlow flow runs every 5 minutes on an incremental basis using bulk mode to update the records. AppFlow can handle somewhat more records than the simple-salesforce Python library, but it still fails during periods of high message volume with the same error mentioned above.
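
The Lambda that stages the CSV for AppFlow is along these lines (again a simplified sketch; the bucket, key prefix, and column names are illustrative assumptions):

```python
import csv
import io
import json
import os
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")


def lambda_handler(event, context):
    # Parse the SQS batch; Id and Status__c stand in for the fields AppFlow maps to Salesforce
    rows = [json.loads(m["body"]) for m in event["Records"]]

    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=["Id", "Status__c"], extrasaction="ignore")
    writer.writeheader()
    writer.writerows(rows)

    # One CSV object per batch; the scheduled AppFlow flow picks up new files incrementally
    key = f"appflow-input/{datetime.now(timezone.utc).isoformat()}.csv"
    s3.put_object(
        Bucket=os.environ["BUCKET_NAME"],
        Key=key,
        Body=buffer.getvalue().encode("utf-8"),
    )
    return {"written": len(rows), "key": key}
```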

Is there anything I can do to overcome this error? Or is there another AWS service or approach for updating records in Salesforce, other than these two options, that would not fail when thousands of requests come in?
