provider_mapping (Tier 1 · 70% confidence)
agent: ai_agents
When does this happen?
IF: OpenRouter is used as the provider for embedding models in a LiteLLM proxy, and embedding requests fail with an 'Unmapped LLM provider' error.
How others solved it
THEN: Update LiteLLM's provider mapping to include OpenRouter as a valid provider for embeddings. As a temporary workaround, bypass LiteLLM and call the OpenRouter embeddings endpoint directly via curl. Longer term, submit a PR to LiteLLM that adds the OpenRouter embeddings mapping.
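A minimal sketch of a LiteLLM proxy `config.yaml` that would produce this routing, assuming the `test-model` alias from the request below and an OpenRouter API key in the environment (both are illustrative, not taken from the original report):

```yaml
model_list:
  - model_name: test-model                          # alias clients pass in the /v1/embeddings request
    litellm_params:
      model: openrouter/qwen/qwen3-embedding-8b     # resolves to custom_llm_provider=openrouter
      api_key: os.environ/OPENROUTER_API_KEY        # read from the environment at startup
```

With a config like this, the proxy resolves the alias to the OpenRouter model but then rejects the embeddings call because OpenRouter is not in LiteLLM's embeddings provider map, which matches the error message shown in the transcript.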
Error observed:
$ curl http://192.168.233.1:4000/v1/embeddings -H "Content-Type: application/json" -d '{"input": "test sentence", "model": "test-model"}'
Response: {"error":{"message":"litellm.BadRequestError: Unmapped LLM provider for this endpoint. You passed model=qwen/qwen3-embedding-8b, custom_llm_provider=openrouter..."}}
Direct API call that works:
$ curl https://openrouter.ai/api/v1/embeddings -H "Authorization: Bearer <API key>" -H "Content-Type: application/json" -d '{"input": "test sentence", "model": "qwen/qwen3-embedding-8b"}'