
Concurrent executions from a FIFO queue


I have a FIFO queue connected to a Lambda function. When a message is sent to the queue, the function picks it up and processes it. My problem occurs when more than one message is sent to the queue: my understanding is that with a FIFO queue the messages are processed one by one if they share the same GroupID. That is not what happens for me: when I send several messages (5, for example) to the queue within a few seconds, the first message goes in flight while the others wait, but as soon as the first message finishes processing, all the remaining 4 messages go in flight together! How can this happen if they share the same GroupID and they are in a FIFO queue? I expect that when the first is completed the second goes in flight, while the other 3 keep waiting, and so on.

I don't think it depends on the queue settings, because I have changed all the parameters many times (visibility timeout, content-based deduplication, and so on). In any case, I have attached screenshots of the settings I currently have below.

My account has a maximum of 10 concurrent executions, and that is exactly the maximum number of messages that end up in flight at the same time (I tried sending many messages, and the first goes in flight alone, then the rest go in flight concurrently ten at a time, which is very strange to me). I would like only one execution at a time per group; the others must wait for the one that is processing to complete. I want to manage concurrency through the different GroupIDs I give to the messages in the queue. I'm sending messages to the queue through the aws-sdk in Node.js, along the lines of the sketch below.
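For reference, a minimal sketch of how such messages could be sent with a shared group ID using the AWS SDK for JavaScript v3 (the region, queue URL, and group ID are placeholders, not the actual values from my setup):

```javascript
// Minimal sketch: send 5 messages to a FIFO queue with the same MessageGroupId,
// so SQS keeps strict ordering within that group.
const { SQSClient, SendMessageCommand } = require("@aws-sdk/client-sqs");

const sqs = new SQSClient({ region: "eu-west-1" }); // assumed region

async function sendMessages() {
  for (let i = 0; i < 5; i++) {
    await sqs.send(new SendMessageCommand({
      QueueUrl: "https://sqs.eu-west-1.amazonaws.com/123456789012/my-queue.fifo", // placeholder
      MessageBody: JSON.stringify({ index: i }),
      MessageGroupId: "group-1", // same group => ordered delivery within the group
      MessageDeduplicationId: `msg-${i}-${Date.now()}`, // optional if content-based deduplication is enabled
    }));
  }
}

sendMessages().catch(console.error);
```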

Can someone help me please?

[Screenshots of the queue settings attached]

1 Answer
Accepted Answer

I am guessing that you set the batch size to more than 1 in your Lambda event source mapping. When you use a FIFO queue, the same consumer can still read more than 1 message from the queue at a time; the idea is that the consumer knows to process the messages of a batch in sequence. If you do not want that, just set the batch size to 1.
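For illustration, here is a minimal sketch of both options, assuming the AWS SDK for JavaScript v3; the region, the helper function, and the event source mapping UUID are placeholders, not values from the question:

```javascript
// Option A: keep batching, but process each batch's records strictly in order
// inside the handler (within one batch, event.Records arrive in queue order).
async function handleOneMessage(message) {
  console.log("processing", message); // placeholder for the real business logic
}

exports.handler = async (event) => {
  for (const record of event.Records) {
    await handleOneMessage(JSON.parse(record.body)); // one record at a time, in order
  }
};

// Option B: set the event source mapping's batch size to 1 so each invocation
// receives a single message. The UUID below is a placeholder.
const { LambdaClient, UpdateEventSourceMappingCommand } = require("@aws-sdk/client-lambda");

async function setBatchSizeToOne() {
  const lambda = new LambdaClient({ region: "eu-west-1" }); // assumed region
  await lambda.send(new UpdateEventSourceMappingCommand({
    UUID: "11111111-2222-3333-4444-555555555555", // replace with your mapping's UUID
    BatchSize: 1,
  }));
}
```

With a batch size of 1, each invocation handles a single message, which should match the one-at-a-time behaviour per message group that the question asks for.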

AWS EXPERT · answered 2 years ago
EXPERT · reviewed 5 months ago
