1 Answer
When you encounter the error "The tool result was too large!" while working with Strands Agents, it means the output from your custom tools (such as your document analysis tool) exceeds the token limit the underlying model can process.
To resolve this issue, you can try several approaches:
- Increase the maximum token count in your inference configuration. Amazon Nova models often need a larger maxTokens value to accommodate tool results. You can set this in your inferenceConfig parameter (the first sketch after this list shows a fuller call):
inferenceConfig={
"maxTokens": 3000
}
- Modify your custom tools to return more concise results (a minimal tool sketch follows this list). Consider:
- Summarizing the document analysis output
- Chunking large responses into smaller pieces
- Filtering out less relevant information
- Implementing pagination for large results
- Use greedy decoding to improve tool-calling success by setting the temperature parameter to 0 (the third sketch after this list applies the same setting at the Strands model level):
inferenceConfig={
"temperature": 0
}
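To make the first suggestion concrete, here is a rough sketch of passing a larger maxTokens value through the Amazon Bedrock Converse API with boto3. The region, model ID, and prompt are placeholders; use whatever your agent already targets.

```python
import boto3

# Bedrock Runtime client; the region is a placeholder.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    # Placeholder model ID; substitute the model your agent actually uses.
    modelId="us.amazon.nova-pro-v1:0",
    messages=[
        {"role": "user", "content": [{"text": "Analyze the findings from the document tool."}]},
    ],
    # Raise maxTokens so the model has room to process large tool results.
    inferenceConfig={"maxTokens": 3000},
)

print(response["output"]["message"]["content"][0]["text"])
```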
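For the second suggestion, this is a minimal sketch of a size-bounded tool, assuming you register tools with the Strands @tool decorator. The file-reading helper and the 6,000-character budget are stand-ins for your real analysis logic.

```python
from strands import tool

MAX_RESULT_CHARS = 6_000  # illustrative budget; tune it for your model's limits


def run_full_analysis(path: str) -> str:
    # Stand-in for your real document analysis; here it just reads the file.
    with open(path, "r", encoding="utf-8") as f:
        return f.read()


@tool
def analyze_document(path: str) -> str:
    """Analyze a document and return a size-bounded result."""
    full_report = run_full_analysis(path)

    # Truncate the report so the tool result stays well under the token limit,
    # and tell the model how much was dropped.
    if len(full_report) > MAX_RESULT_CHARS:
        omitted = len(full_report) - MAX_RESULT_CHARS
        return f"{full_report[:MAX_RESULT_CHARS]}\n\n[Truncated: {omitted} characters omitted]"
    return full_report
```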
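For the third suggestion, the same settings can also be applied on the model object if you configure the model at the Strands level rather than calling Bedrock directly. This sketch assumes the BedrockModel constructor accepts max_tokens and temperature keyword arguments; the model ID is again a placeholder.

```python
from strands import Agent
from strands.models import BedrockModel

# Placeholder model ID; use the model your agent already targets.
model = BedrockModel(
    model_id="us.amazon.nova-pro-v1:0",
    max_tokens=3000,   # room for large tool results
    temperature=0,     # greedy decoding for more reliable tool calls
)

agent = Agent(model=model)
```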
- If you're consistently hitting token limits, implement a strategy for handling large documents: process them in smaller chunks or add a summarization step before the tool returns its results. A sketch of that approach is shown below.
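One way to implement that strategy inside the tool itself is to split the document into fixed-size chunks and return only a short digest of each chunk. The chunk size and the naive first-sentences digest below are placeholders for whatever summarization you actually use:

```python
CHUNK_SIZE = 4_000  # characters per chunk; adjust for your documents


def chunk_text(text: str, size: int = CHUNK_SIZE) -> list[str]:
    """Split the text into fixed-size chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]


def digest_chunk(chunk: str, sentences: int = 2) -> str:
    """Very naive stand-in for a real summarizer: keep the first few sentences."""
    return ". ".join(chunk.split(". ")[:sentences]).strip()


def summarize_document(text: str) -> str:
    """Return a compact chunk-by-chunk digest instead of the full document."""
    digests = [
        f"[Chunk {i + 1}] {digest_chunk(chunk)}"
        for i, chunk in enumerate(chunk_text(text))
    ]
    return "\n".join(digests)
```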
Remember that when working with Strands Agents, complex interactions and tool usage can consume tokens quickly, especially with features like Claude 4's interleaved thinking, which may use additional tokens for reasoning.
Sources
Building Agentic AI app with Strands Agents & MCP | AWS re:Post
Troubleshooting tool calls - Amazon Nova