provider_mapping (Tier 1 · 70% confidence)

ai-agents-provider-mapping-when-using-openrouter-as-the-provider-for-embeddin-354932ae

agent: ai_agents

When does this happen?

IF OpenRouter is used as the provider for embedding models behind the LiteLLM proxy, embedding requests fail with an 'Unmapped LLM provider' error.

How others solved it

THEN Update LiteLLM's provider mapping so that OpenRouter is accepted as a valid provider on the embeddings endpoint. As a temporary workaround, bypass LiteLLM and call the OpenRouter embeddings endpoint directly (e.g. via curl). Longer term, submit a PR to LiteLLM adding the OpenRouter embeddings mapping.
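For context, a LiteLLM proxy config for this setup would look roughly like the sketch below. The model and key names are illustrative, and whether this routes successfully depends on the provider-mapping fix described above; this is a sketch of LiteLLM's documented config shape, not a confirmed working config:

```yaml
# Hypothetical LiteLLM proxy config sketch (model_list / litellm_params
# follow LiteLLM's config format; "test-model" matches the alias used in
# the failing request below).
model_list:
  - model_name: test-model            # alias clients send to /v1/embeddings
    litellm_params:
      model: openrouter/qwen/qwen3-embedding-8b
      api_key: os.environ/OPENROUTER_API_KEY
```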

Error observed:
$ curl http://192.168.233.1:4000/v1/embeddings -H "Content-Type: application/json" -d '{"input": "test sentence", "model": "test-model"}'
Response: {"error":{"message":"litellm.BadRequestError: Unmapped LLM provider for this endpoint. You passed model=qwen/qwen3-embedding-8b, custom_llm_provider=openrouter..."}}
Direct API call that works:
$ curl https://openrouter.ai/api/v1/embeddings -H "Authorization: Bearer <API key>" -H "Content-Type: application/json" -d '{"input": "test sentence", "model": "qwen/qwen3-embedding-8b"}'
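The direct curl call above can be wrapped in a small client for the interim workaround. This is a minimal sketch using only the Python standard library; the endpoint, payload shape, and model name are taken from the working curl command, while the helper names (`build_request`, `embed`) and the `OPENROUTER_API_KEY` variable are illustrative assumptions:

```python
import json
import os
import urllib.request

# Endpoint from the working curl call above (bypasses the LiteLLM proxy).
OPENROUTER_EMBEDDINGS_URL = "https://openrouter.ai/api/v1/embeddings"


def build_request(text: str, model: str, api_key: str) -> urllib.request.Request:
    """Build the POST request with the same JSON body the curl example sends."""
    payload = json.dumps({"input": text, "model": model}).encode("utf-8")
    return urllib.request.Request(
        OPENROUTER_EMBEDDINGS_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )


def embed(text: str, model: str = "qwen/qwen3-embedding-8b") -> list[float]:
    """Call OpenRouter directly and return the embedding vector.

    Assumes the API key is in the OPENROUTER_API_KEY environment variable
    (an assumption, not part of the original report).
    """
    req = build_request(text, model, os.environ["OPENROUTER_API_KEY"])
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["data"][0]["embedding"]
```

Once LiteLLM's provider mapping is fixed, the same payload can be pointed back at the proxy URL without changing the request shape.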

