1 Answer
Hello.
As far as I can see from the model-comparison documentation below, the maximum output is currently fixed at 4,096 tokens, so the output token count cannot be higher than 4096 at the moment.
https://docs.anthropic.com/en/docs/models-overview#model-comparison
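As a minimal sketch of what this looks like in practice, the request body sent to Bedrock for an Anthropic model carries an explicit `max_tokens` field, which you would cap at 4096 per the documentation above. The model ID and prompt below are placeholder assumptions, not values from the thread:

```python
import json

# Documented ceiling for output tokens (see the model-comparison page above).
MAX_OUTPUT_TOKENS = 4096

def build_request_body(prompt, max_tokens=MAX_OUTPUT_TOKENS):
    """Build an Anthropic Messages API body for Bedrock, clamping
    max_tokens to the documented 4096 ceiling so the request is not
    rejected."""
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": min(max_tokens, MAX_OUTPUT_TOKENS),
        "messages": [{"role": "user", "content": prompt}],
    }
    return json.dumps(body)

# The actual invocation requires AWS credentials and boto3; the model ID
# here is only an example:
# import boto3
# client = boto3.client("bedrock-runtime")
# response = client.invoke_model(
#     modelId="anthropic.claude-3-sonnet-20240229-v1:0",
#     body=build_request_body("Hello"),
# )
```

Requesting more than 4096 output tokens in the body would be rejected by the service, so clamping client-side keeps the call valid.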
Hi HMG Tasha, to verify it you can always use a very long string as the prompt and invoke the LLM with it: you will get a ValidationException error whose message tells you the maximum allowed length.
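The probing approach described above could be sketched as follows. This is an assumption-laden example, not a definitive implementation: the model ID is a placeholder, and it assumes the service surfaces the limit via a `ValidationException` error code in the boto3 `ClientError`:

```python
import json

def probe_prompt_limit(model_id, prompt_chars=1_000_000):
    """Invoke a Bedrock model with a deliberately oversized prompt.
    On ValidationException, return the error message, which states
    the maximum authorized length; otherwise return None."""
    # boto3 is imported here so the function is the only part that
    # needs the AWS SDK and credentials.
    import boto3
    from botocore.exceptions import ClientError

    client = boto3.client("bedrock-runtime")
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": "x" * prompt_chars}],
    })
    try:
        client.invoke_model(modelId=model_id, body=body)
    except ClientError as err:
        if err.response["Error"]["Code"] == "ValidationException":
            return err.response["Error"]["Message"]
        raise
    return None

# Example (requires AWS credentials and model access; example model ID):
# print(probe_prompt_limit("anthropic.claude-3-sonnet-20240229-v1:0"))
```

Reading the limit out of the error message is a quick empirical check, but the model-comparison page linked in the answer remains the authoritative source for the documented limits.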