Hi,
For your use case, you can consider the Amazon Bedrock Converse API, which allows you to send a document or long text as part of a message and ask the model to describe or summarise its contents. A minimal sketch is shown below.
Please refer to the blog post Getting started with the Amazon Bedrock Converse API and the AWS documentation, which include examples of how to send a document or text.
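For illustration, here is a rough boto3 sketch of sending a document to the Converse API and asking for a summary. The region, model ID, and file name are placeholder assumptions; any Converse-compatible model that supports document blocks should work.

```python
import boto3

# Placeholder region and model ID -- substitute your own.
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-east-1")

# Read the document to be summarised (supported formats include pdf, docx, txt, md, ...).
with open("report.pdf", "rb") as f:
    document_bytes = f.read()

response = bedrock_runtime.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[
        {
            "role": "user",
            "content": [
                {"text": "Please summarise the attached document."},
                {
                    "document": {
                        "format": "pdf",
                        "name": "report",
                        "source": {"bytes": document_bytes},
                    }
                },
            ],
        }
    ],
)

# The model's reply comes back as content blocks; print the text block.
print(response["output"]["message"]["content"][0]["text"])
```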
Hope this is helpful.
Thanks, Rama
If a real-time response is not the primary goal, you can use batch inference to reduce cost and process a large number of requests at once. The input data must be placed in S3. https://docs.aws.amazon.com/bedrock/latest/userguide/batch-inference.html
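As a rough sketch of starting such a job with boto3 (the bucket names, role ARN, and model ID below are placeholder assumptions; the input file must be JSONL with one model request per line):

```python
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

response = bedrock.create_model_invocation_job(
    jobName="document-summaries-batch",
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    # Placeholder IAM role that allows Bedrock to read/write the S3 locations.
    roleArn="arn:aws:iam::123456789012:role/BedrockBatchInferenceRole",
    inputDataConfig={
        "s3InputDataConfig": {
            # JSONL file where each line is {"recordId": "...", "modelInput": {...}}
            "s3Uri": "s3://my-input-bucket/batch/requests.jsonl"
        }
    },
    outputDataConfig={
        "s3OutputDataConfig": {"s3Uri": "s3://my-output-bucket/batch/results/"}
    },
)

# The job runs asynchronously; results land in the output S3 prefix when it completes.
print(response["jobArn"])
```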
To include a file attachment in an Amazon Bedrock inference request, put the document bytes in the request body itself rather than sending a multipart/form-data request: for example, as a document content block alongside your text prompt in a Converse API message (as in the sketch above), or base64-encoded in the model-specific InvokeModel payload.