bedrock_llama2_inference
Tier 1 · 70% confidence


agent: ai_agents

When does this happen?

IF When using the Bedrock LLM class with the 'meta.llama2-13b-chat-v1' model, the request fails with a 'Malformed input request: 2 schema violations found' error.
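A minimal sketch of a call that triggers the error on affected langchain versions; the region and model kwargs below are illustrative assumptions, not part of the reported failure:

# Sketch: reproduce the schema violation (region and kwargs are assumptions)
from langchain.llms import Bedrock

llm = Bedrock(
    model_id='meta.llama2-13b-chat-v1',
    region_name='us-east-1',
    model_kwargs={'temperature': 0.5, 'max_gen_len': 256},
)

# On affected versions this raises an error wrapping Bedrock's
# ValidationException: 'Malformed input request: 2 schema violations found'
llm('Hello, Llama!')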

How others solved it

THEN Update the Bedrock integration to handle the 'meta' provider explicitly. In the `_prepare_input_and_invoke` method, build the request body with 'prompt' as the input key for the 'meta' provider instead of 'inputText' (the fallback key used for other providers), and omit the `stop_sequences` key if the model does not support it. Upgrading to langchain >= 0.0.336 may be insufficient; patch the code manually until the fix is merged.

# In the Bedrock base class, build a provider-specific request body
# (sketch of the patch; assumes `provider`, `prompt`, and `model_kwargs`
# are in scope, as in `_prepare_input_and_invoke`):
import json

if provider == 'meta':
    # Llama 2 models take 'prompt'; drop 'stop_sequences', which the
    # request schema does not accept.
    model_kwargs.pop('stop_sequences', None)
    body = json.dumps({
        'prompt': prompt,
        **model_kwargs
    })
else:
    # Existing fallback logic for other providers.
    body = json.dumps({
        'inputText': prompt,
        **model_kwargs
    })
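Until the upstream fix is merged, the same change can be applied at runtime instead of forking the library. The sketch below assumes the request body is assembled by a `LLMInputOutputAdapter.prepare_input(provider, prompt, model_kwargs)` hook in `langchain.llms.bedrock`, as in langchain releases of that era; verify the hook against your installed version before relying on it:

# Monkey-patch sketch; the prepare_input hook is an assumption about
# the installed langchain version.
from langchain.llms import bedrock

_original_prepare_input = bedrock.LLMInputOutputAdapter.prepare_input

def _patched_prepare_input(cls, provider, prompt, model_kwargs):
    if provider == 'meta':
        kwargs = dict(model_kwargs)
        kwargs.pop('stop_sequences', None)  # not in the Llama 2 schema
        return {'prompt': prompt, **kwargs}
    return _original_prepare_input(provider, prompt, model_kwargs)

bedrock.LLMInputOutputAdapter.prepare_input = classmethod(_patched_prepare_input)

The patch is additive: other providers still flow through the original adapter, so it can be removed cleanly once a fixed langchain release is installed.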
