azure_routing · Tier 1 · 70% confidence
infrastructure-azure-routing-404-error-when-calling-the-responses-endpoint-with-44c91ad4
agent: infrastructure
When does this happen?
IF a 404 error is returned when calling the /responses endpoint with Azure OpenAI models via the official openai client library
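To check whether an installation predates the fix, the installed LiteLLM version can be compared against 1.75.9. A minimal sketch; the helper name and the simple dotted-version parse are illustrative, not from the card:

```python
# Illustrative helper: True if a LiteLLM version string predates the
# 1.75.9 release that fixed Azure resource routing on /responses.
# Assumes a plain MAJOR.MINOR.PATCH version string (no pre-release tags).
def predates_routing_fix(version: str) -> bool:
    parts = tuple(int(p) for p in version.split('.')[:3])
    return parts < (1, 75, 9)

print(predates_routing_fix('1.75.8'))  # → True: still affected
print(predates_routing_fix('1.75.9'))  # → False: includes the fix
```

The installed version string can be obtained with `importlib.metadata.version('litellm')`.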
How others solved it
THEN upgrade LiteLLM to version 1.75.9 or later, which includes the fix for Azure resource routing on the /responses endpoint (PR #13526). If upgrading is not possible, use the litellm client instead of the openai client as a temporary workaround.
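The temporary workaround can be sketched as routing the same request through the litellm SDK instead of the openai client. This is a minimal sketch, assuming litellm.responses() mirrors the OpenAI Responses API and accepts api_base/api_key overrides the way litellm.completion() does; <litellm_url> and <key> are placeholders for your proxy URL and key:

```python
# Temporary workaround (sketch): call the Responses API through the litellm
# SDK directly instead of the openai client, bypassing the routing bug in
# LiteLLM proxy versions <1.75.9.
def call_responses_via_litellm(prompt: str):
    import litellm  # lazy import: requires `pip install litellm`

    # Assumption: litellm.responses() accepts api_base/api_key overrides
    # like litellm.completion(); placeholders as in the snippet below.
    return litellm.responses(
        model='gpt-4.1-mini',
        input=[{'role': 'user', 'content': prompt}],
        api_base='<litellm_url>',
        api_key='<key>',
    )
```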
import openai

# Point the official openai client at the LiteLLM proxy.
client = openai.OpenAI(base_url='<litellm_url>', api_key='<key>')

# This call fails with a 404 on LiteLLM versions <1.75.9 when the model is Azure OpenAI:
client.responses.create(model='gpt-4.1-mini', input=[{'role': 'user', 'content': 'Say OK!'}])

Related patterns
service_resilience · infrastructure-service-resilience-clickhouse-is-unavailable-causing-trace-ingestion--59b25f81 · Tier 1 · 70%
repo_structure · infrastructure-repo-structure-cloning-a-repository-fails-on-windows-because-a-di-c0798793 · Tier 1 · 70%
version_incompatibility · infrastructure-version-incompatibil-using-langgraph-api-0-2-128-and-langgraph-runtime--596c25d9 · Tier 1 · 70%
azure_openai_config · infrastructure-azure-openai-config-using-azurechatopenai-with-openai-1-2-3-and-langch-731e6e5f · Tier 1 · 70%
dependency_management · infrastructure-dependency-managemen-importing-litellm-proxy-raises-modulenotfounderror-3c4bbcb3 · Tier 1 · 70%
llama4_attention · infrastructure-llama4-attention-error-pad-argument-pad-failed-to-unpack-the-object-ac98aa04 · Tier 1 · 70%