Amazon Bedrock: Prompt Management [promptcache] not permitted error


Hi, hoping someone can help me.

I'm trying to create a prompt where my llm will suggest two relevant questions based on the user's data.

However, I'm getting this error when trying to invoke the prompt: "The model returned the following errors: Malformed input request: #: extraneous key [promptCache] is not permitted, please reformat your input and try again."

I'm using Llama 70B, but I tried Claude as well and get the same error.

Here are my system instructions: Based on the user's question, suggest two relevant and actionable questions that explore the data further by using the following principles to guide your questions:

  1. Relevance: Ensure the questions logically extend the analysis just performed.
  2. Exploration Depth: Include questions that provide additional layers of insight.
  3. Perspective Variety: Offer questions that analyze data from different angles (e.g., time trends, category breakdowns, behavioral insights).
  4. Actionability: Ensure the questions lead to actionable or meaningful insights.

Avoid redundancy by ensuring each question explores a unique aspect of the data. Focus on generating questions that encourage the user to dive deeper into patterns, trends, or anomalies in the data. Reframe the questions as two short and concise statements in JSON format.

User message: Question: {{user_question}}

asked 2 months ago
1 Answer

The error you're encountering suggests that there's an issue with the input format of your prompt. Specifically, the error message indicates that an extraneous key [promptCache] is present in your request, which is not permitted.

To resolve this issue, you should review your prompt configuration and ensure that you're not inadvertently including a 'promptCache' field in your request. This field is not a standard parameter for prompt invocation in Amazon Bedrock.
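
For illustration only, here is roughly where such a key can sneak in when calling InvokeModel with a model's native request body. This is a minimal sketch that assumes the Llama request schema (prompt, max_gen_len, temperature, top_p) and a placeholder model ID; any key outside the model's documented schema, such as promptCache, is rejected with exactly this kind of "extraneous key" message.

```python
import json
import boto3

# Sketch only: assumes us-east-1 and a Llama model ID you have access to.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

body = {
    "prompt": "Question: What were last quarter's top-selling categories?",
    "max_gen_len": 512,
    "temperature": 0.5,
    # "promptCache": {...}  # an extra key like this is outside the Llama
    #                       # request schema and produces "Malformed input
    #                       # request: extraneous key [promptCache]".
}

response = client.invoke_model(
    modelId="meta.llama3-70b-instruct-v1:0",
    body=json.dumps(body),
)
print(json.loads(response["body"].read())["generation"])
```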

Here are some steps you can take to troubleshoot and resolve the issue:

  1. Double-check your prompt configuration in the Amazon Bedrock console. Make sure you're not including any unexpected fields like 'promptCache'.

  2. Ensure that your prompt is properly formatted according to the model's requirements. For Llama 70B and Claude, the prompt structure might be slightly different.

  3. When testing your prompt, use the Prompt Builder in the Amazon Bedrock console. This tool allows you to select a model, set model parameters, write the prompt message, and quickly evaluate the output.

  4. If you're using the API to invoke the prompt, review your API call to ensure it doesn't include any extraneous fields (see the sketch after this list).

  5. Try simplifying your prompt temporarily to isolate the issue. Start with a basic prompt without variables or complex instructions, then gradually add complexity back in.

  6. If the issue persists, you may want to create a new prompt from scratch in the Prompt Management console and transfer your instructions there.
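
If you are calling the prompt from code, a clean invocation leaves no place for a promptCache field. Below is a minimal sketch assuming boto3 and the Converse API, with a placeholder prompt ARN and the {{user_question}} variable from your prompt; adjust the region, account, and prompt ID to your setup.

```python
import boto3

# Sketch only: invoke a Prompt Management prompt via the Converse API.
# The ARN is a placeholder; substitute your own prompt ARN (optionally
# with a version suffix, e.g. ":1").
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="arn:aws:bedrock:us-east-1:111122223333:prompt/PROMPT12345",
    promptVariables={
        "user_question": {"text": "Which product category grew fastest month over month?"}
    },
)

# The suggested follow-up questions come back as ordinary message content.
print(response["output"]["message"]["content"][0]["text"])
```

Because the system instructions and the user message template live in the prompt resource itself, the call only passes the variable values.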

Remember, when using Prompt Management, you typically don't need to include system instructions directly in your API calls. The system instructions should be part of the prompt configuration in the Prompt Management console.

If you continue to experience issues after trying these steps, you may want to contact AWS support for further assistance, as there might be a specific issue with your account or the Prompt Management feature configuration.
Sources
Prompt Management for Amazon Bedrock - AWS
Test a prompt using Prompt management - Amazon Bedrock
Modify a prompt using Prompt management - Amazon Bedrock

answered 2 months ago
