Hello,
Process in Chunks with Status Updates
- Instead of streaming, you can process your long-running task in smaller chunks and return status updates as responses.
Steps:
1. Divide Task into Smaller Chunks
- Break your long task into smaller, manageable steps.
2. Use a Simple Polling Mechanism on the Client-Side
- The frontend can periodically check for updates from the backend.
3. Store Task Progress Temporarily in a Database
- Use a simple DynamoDB table to store the progress of the task (a minimal table-creation sketch follows this list).
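For step 3, the table only needs taskId as its partition key. As a rough sketch (assuming the same TaskProgress table name used in the example below; the console, CloudFormation, or CDK work just as well), a one-off setup script with the v2 SDK could look like:

import AWS from 'aws-sdk';

const dynamodb = new AWS.DynamoDB();

// One-off setup: create the TaskProgress table with taskId as the partition key.
// On-demand billing keeps things simple for a low-volume progress tracker.
async function createTaskProgressTable() {
  await dynamodb.createTable({
    TableName: 'TaskProgress',
    AttributeDefinitions: [{ AttributeName: 'taskId', AttributeType: 'S' }],
    KeySchema: [{ AttributeName: 'taskId', KeyType: 'HASH' }],
    BillingMode: 'PAY_PER_REQUEST',
  }).promise();
}

createTaskProgressTable().catch(console.error);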
Example code to implement this:
Lambda Function
import { Handler } from 'aws-lambda';
import AWS from 'aws-sdk';

const dynamoDb = new AWS.DynamoDB.DocumentClient();

export const handler: Handler = async (event) => {
  const taskId = event.taskId || 'default-task';

  // Simulate task processing (e.g., processing part of the task)
  const taskProgress = { status: 'In Progress', step: 1 }; // Simulate task step

  // Store task progress in DynamoDB
  await dynamoDb.put({
    TableName: 'TaskProgress',
    Item: {
      taskId: taskId,
      progress: taskProgress,
    },
  }).promise();

  return {
    statusCode: 200,
    body: JSON.stringify(taskProgress),
  };
};
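The React component below polls a GET /task-status/{taskId} endpoint, so a read path is needed as well. A minimal sketch of that handler (assuming an API Gateway proxy integration and the same table and attribute names; adapt the event shape to however your API passes the taskId):

import { Handler } from 'aws-lambda';
import AWS from 'aws-sdk';

const dynamoDb = new AWS.DynamoDB.DocumentClient();

// Read side of the polling endpoint: look up the stored progress item by taskId.
export const handler: Handler = async (event) => {
  // With a proxy integration the id arrives in pathParameters; fall back to the
  // same default id used by the writer above.
  const taskId = (event.pathParameters && event.pathParameters.taskId) || 'default-task';

  const result = await dynamoDb.get({
    TableName: 'TaskProgress',
    Key: { taskId },
  }).promise();

  return {
    statusCode: 200,
    headers: { 'Access-Control-Allow-Origin': '*' }, // adjust CORS to your setup
    body: JSON.stringify(result.Item ? result.Item.progress : { status: 'Not Started' }),
  };
};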
Frontend React - Polling for Task Status
import React, { useState, useEffect } from 'react';
import { API } from 'aws-amplify';

function PollingComponent({ taskId }) {
  const [taskStatus, setTaskStatus] = useState('Not Started');

  useEffect(() => {
    const interval = setInterval(async () => {
      try {
        const response = await API.get('YourApiName', `/task-status/${taskId}`);
        setTaskStatus(response.status);
      } catch (error) {
        console.error('Error fetching task status:', error);
      }
    }, 5000); // Poll every 5 seconds

    return () => clearInterval(interval);
  }, [taskId]);

  return (
    <div>
      <h3>Task Status: {taskStatus}</h3>
    </div>
  );
}

export default PollingComponent;
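For completeness, a minimal way to wire the component up (sketch; the aws-exports file is the Amplify-CLI-generated config, and 'YourApiName' stands in for whatever REST API name your Amplify project actually defines):

// App.tsx (sketch) - configure Amplify once, then render the poller for a given task id
import React from 'react';
import Amplify from 'aws-amplify';
import awsExports from './aws-exports';
import PollingComponent from './PollingComponent';

Amplify.configure(awsExports);

export default function App() {
  return <PollingComponent taskId="default-task" />;
}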
- Processes the task in chunks and stores progress in DynamoDB.
- Polls every few seconds to get the latest task status.
A little more context: my function calls APIs (which themselves use streaming) for audio processing and an LLM. The whole thing can take something like 10 s, while the response could ideally start after just 1 or 2 seconds.
So the response cannot be anything other than a stream. I perfectly understand your solution, but for my case it is not a good fit.
Regarding the last part of my message about using lambda-stream: I tested the handler and I get the following message:
{ "data": { "myCustomFunction": null }, "errors": [ { "path": [ "myCustomFunction" ], "data": null, "errorType": "Lambda:IllegalArgument", "errorInfo": null, "locations": [ { "line": 2, "column": 3, "sourceName": null } ], "message": "Error while de-serializing lambda response payload. Got: Hello world from Lambda!" } ] }
meaning the stream processing part works (the text "Hello world from Lambda!" is sent in multiple parts) but is checked by some higher-level processing that makes it fail, so it might be worth looking in this direction.
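For reference, the kind of handler I tested looks roughly like this (a sketch built on the lambda-stream package; the exact chunking is only there to exercise streaming):

import { streamifyResponse, ResponseStream } from 'lambda-stream';

// streamifyResponse wraps the handler so the body can be written incrementally
// instead of being returned as a single payload at the end.
export const handler = streamifyResponse(
  async (event: unknown, responseStream: ResponseStream) => {
    // Send the test message in several chunks.
    responseStream.write('Hello world ');
    responseStream.write('from Lambda!');
    responseStream.end();
  }
);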
Any ideas or tips are welcome! (Can the higher-level code running AWS Lambda be found anywhere? I am not afraid of a deep dive but cannot find anything right now.)
Anyway, thanks for your response!