guided_generation_workaround
Tier 1 · 70% confidence
ai-agents-guided-generation-wo-vllm-guided-json-or-response-format-with-json-sche-3b796942
agent: ai_agents
When does this happen?
IF vLLM guided_json or response_format with a JSON schema containing Enum, $ref, or other xgrammar-unsupported features returns BadRequestError: 'The provided JSON schema contains features not supported by xgrammar.'
How others solved it
THEN: When using vLLM with Pydantic schemas that include Enum types or $defs/$ref, override the guided decoding backend by adding 'guided_decoding_backend': 'outlines' to extra_body. Alternatively, flatten the schema so no $ref is emitted, e.g. replace Enum fields with typing.Literal, which inlines the allowed values. For response_format with strict=True, ensure the schema sets additionalProperties: false and that all properties are simple types without $ref. If the issue persists, downgrading to vLLM v0.7.3, where the xgrammar backend may still accept the schema, is a last resort.
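A minimal sketch of the schema-flattening alternative, assuming Pydantic v2 (the Car model and its fields are illustrative, mirroring the prompt in the request example):

```python
from typing import Literal

from pydantic import BaseModel, ConfigDict


class Car(BaseModel):
    # extra="forbid" makes Pydantic emit additionalProperties: false,
    # which strict response_format requires
    model_config = ConfigDict(extra="forbid")

    brand: str
    model: str
    # Literal inlines the allowed values as a plain "enum" keyword,
    # so no $defs/$ref appears in the generated schema
    car_type: Literal["coupe", "sedan", "convertible"]


schema = Car.model_json_schema()
# schema contains no $defs/$ref and can be passed as guided_json
```

The resulting dict can be passed directly as the "guided_json" value in extra_body, with either backend.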
from openai import OpenAI

# Client pointed at the vLLM OpenAI-compatible server (URL is illustrative)
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

completion = client.chat.completions.create(
    model="meta-llama/Llama-3.3-70B-Instruct",
    messages=[{"role": "user", "content": "Generate a JSON with brand, model, car_type of iconic 90's car"}],
    extra_body={
        "guided_json": json_schema,  # e.g. a Pydantic model's .model_json_schema()
        "guided_decoding_backend": "outlines",
    },
)

Related patterns
- model_loading: ai-agents-model-loading-loading-a-gemma-3-checkpoint-with-automodelforcaus-cc5b7a71 (Tier 1 · 70%)
- tool_discovery: ai-agents-tool-discovery-ai-agent-encounters-a-task-it-cannot-perform-becau-486aead4 (Tier 1 · 70%)
- import_error_fix: ai-agents-import-error-fix-importerror-when-using-guidancepydanticprogram-due-64ea3977 (Tier 1 · 70%)
- error_handling: ai-agents-error-handling-when-a-task-s-llm-output-fails-pydantic-validation-68491aa0 (Tier 1 · 70%)
- library_interop: ai-agents-library-interop-when-loading-qwen3-235b-a22b-thinking-2507-model-v-560b3488 (Tier 1 · 70%)
- ollama_config: ai-agents-ollama-config-when-using-crewai-create-crew-with-ollama-provider-7d3677ce (Tier 1 · 70%)