Cannot run my Lambda due to the hard-coded memory limit.


I am using a Lambda function which I recently extended in functionality. To run the new Python code I needed to add another layer (scipy). But when I try to add the scipy layer, Lambda tells me that my limit of 262 MB is exceeded. My Lambda is polling an API and processing the received data (spectral data). What would you suggest to get the function running?

Thanks and BR Tobias

Tobi
Asked 3 months ago · 113 views
3 Answers

Hello.

I have never used scipy, but how large will it be once packaged as a layer?
Lambda layers count toward a 250 MB deployment package limit, so if the total is larger than this, they cannot be used.
https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-limits.html

I think one workaround for the layer size limit is to use a container image as the runtime.
I think that lets you package somewhat larger dependencies.
https://docs.aws.amazon.com/lambda/latest/dg/images-create.html#images-types
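As a minimal sketch of that approach (the Python version, file name, and handler name below are placeholders, not from the question), a container-image Lambda that bundles scipy might look like:

```dockerfile
# AWS-provided base image for Python Lambdas (assumed version)
FROM public.ecr.aws/lambda/python:3.12

# Install the heavy dependencies inside the image instead of a layer
RUN pip install scipy

# Copy the function code into the image's task root
COPY lambda_function.py ${LAMBDA_TASK_ROOT}

# Handler in "module.function" form (placeholder names)
CMD ["lambda_function.lambda_handler"]
```

You would then build and push this image to Amazon ECR and create the function from it; the image-based package limit is much larger than the 250 MB layer quota.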

Also, if you run into a memory limit during execution, I think there will be no problem if you increase the memory in the Lambda settings.
https://docs.aws.amazon.com/lambda/latest/dg/configuration-function-common.html#configuration-memory-console
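This can also be done from the CLI; as a sketch (the function name is a placeholder), note that raising memory also scales the CPU allocated to the function:

```shell
# Raise the function's memory limit (placeholder function name)
aws lambda update-function-configuration \
  --function-name my-spectral-poller \
  --memory-size 1024
```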

Expert
Answered 3 months ago
AWS Expert Uri
Reviewed 3 months ago

As per the documentation, the deployment package limit is 250 MB, which is 262,144,000 bytes (the 262 MB figure in your error). This quota applies to all the files you upload, including layers and custom runtimes. Try removing unnecessary packages.
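As a rough sketch of trimming a layer before zipping it (assuming the standard layer layout with dependencies under a local `python/` directory; the directory name and paths are assumptions), caches, bundled test suites, and debug symbols can often be stripped safely:

```shell
LAYER_DIR=python            # standard layer layout for Python dependencies
mkdir -p "$LAYER_DIR"       # no-op if the layer is already built here

# Inspect the unpacked size; this is what counts toward the 250 MB quota
du -sh "$LAYER_DIR"

# Drop bytecode caches and bundled test suites (not needed at runtime)
find "$LAYER_DIR" -type d -name "__pycache__" -prune -exec rm -rf {} +
find "$LAYER_DIR" -type d -name "tests" -prune -exec rm -rf {} +

# Strip debug symbols from compiled extensions (scipy ships many .so files)
find "$LAYER_DIR" -name "*.so" -exec strip --strip-unneeded {} + 2>/dev/null || true

du -sh "$LAYER_DIR"
```

Whether this saves enough depends on the package; scipy's compiled extensions are large even after stripping.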

If you are unable to reduce the package size, you could explore switching from the Lambda Python runtime to a Lambda container image instead. The code package size limit there is 10 GB (maximum uncompressed image size, including all layers).

AWS
Expert
Mike_L
Answered 3 months ago

If you need to process large files or load many large dependencies in AWS Lambda, you can put them on an EFS volume, which can be mounted to the Lambda function.
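As a minimal sketch of that pattern (the mount path and package directory below are assumptions; they depend on how you configure the function's file-system access point and where you pip-install the packages on EFS):

```python
import sys

# Hypothetical EFS mount path configured in the function's file-system settings,
# with dependencies pre-installed there via "pip install --target ..."
EFS_PACKAGES = "/mnt/efs/python-packages"

# Prepend the EFS directory to the import path before any heavy imports,
# so packages too large for a layer are resolved from the mounted volume
if EFS_PACKAGES not in sys.path:
    sys.path.insert(0, EFS_PACKAGES)

# import scipy  # would now be resolved from the EFS volume
```

Note that imports from EFS add cold-start latency, since the modules are read over NFS on first load.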

Answered 3 months ago
