
Getting runtime error "The tool result was too large!"


The Strands framework has size limits for tool results, and several tools that I built, e.g. a document analysis tool, were exceeding these limits. This caused runtime errors with the message "The tool result was too large!". I am building custom tools using Strands: https://github.com/strands-agents/tools

1 Answer

When you encounter the error "The tool result was too large!" while working with Strands Agents, it indicates that the output from your custom tools (such as your document analysis tool) exceeds the token limits the underlying model can process.

To resolve this issue, you can try several approaches:

  1. Increase the maximum token count in your inference configuration. Amazon Nova models often need a larger token budget for tool results. You can set this in your inferenceConfig parameter (see the configuration sketch after this list):

       inferenceConfig={
           "maxTokens": 3000
       }
  2. Modify your custom tools to return more concise results (a truncation sketch follows this list). Consider:
  • Summarizing the document analysis output
  • Chunking large responses into smaller pieces
  • Filtering out less relevant information
  • Implementing pagination for large results
  3. Use greedy decoding to improve tool-calling success. Set the temperature parameter to 0 (also shown in the configuration sketch below):

       inferenceConfig={
           "temperature": 0
       }
  4. If you're consistently hitting token limits, implement a strategy for handling large documents: process them in smaller chunks, or add a summarization step before returning results (see the chunking sketch below).
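
For example, here is a minimal sketch of how the two inference settings from steps 1 and 3 might be applied when constructing a Strands agent. It assumes the strands-agents Python SDK, where BedrockModel accepts max_tokens and temperature keyword arguments, and uses a Nova model ID purely for illustration:

    # Sketch only: parameter names may differ in your SDK version.
    from strands import Agent
    from strands.models import BedrockModel

    model = BedrockModel(
        model_id="us.amazon.nova-pro-v1:0",  # assumed model ID for illustration
        max_tokens=3000,   # step 1: larger output budget for tool results
        temperature=0.0,   # step 3: greedy decoding for more reliable tool calls
    )

    agent = Agent(model=model)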
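
Here is a hedged sketch of step 2: a custom tool that caps its own output before returning it. run_analysis and the character budget are placeholders for your own analysis logic and limits:

    from strands import tool

    MAX_RESULT_CHARS = 8000  # illustrative budget; tune to your model's limits

    @tool
    def analyze_document(path: str) -> str:
        """Analyze a document, returning at most MAX_RESULT_CHARS characters."""
        with open(path, encoding="utf-8") as f:
            text = f.read()
        result = run_analysis(text)  # placeholder for your existing analysis code
        if len(result) > MAX_RESULT_CHARS:
            result = result[:MAX_RESULT_CHARS] + "\n[output truncated]"
        return result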
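
Finally, a sketch of the chunk-and-summarize strategy from step 4. summarize_chunk stands in for whatever summarization you choose (another model call, an extractive summarizer, etc.):

    from strands import tool

    def split_into_chunks(text: str, size: int = 4000) -> list[str]:
        """Split text into fixed-size chunks (a deliberately simple scheme)."""
        return [text[i:i + size] for i in range(0, len(text), size)]

    @tool
    def summarize_large_document(path: str) -> str:
        """Summarize a large document chunk by chunk, returning only the summaries."""
        with open(path, encoding="utf-8") as f:
            text = f.read()
        # summarize_chunk is a placeholder for your summarization step
        summaries = [summarize_chunk(chunk) for chunk in split_into_chunks(text)]
        return "\n\n".join(summaries)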

Remember that when working with Strands Agents, complex interactions and tool usage can quickly consume tokens, especially with features like Claude 4's interleaved thinking, which may use more tokens for reasoning.
Sources
Building Agentic AI app with Strands Agents & MCP | AWS re:Post
Troubleshooting tool calls - Amazon Nova

answered 3 months ago
