model_compatibility — Tier 1 · 70% confidence
ai-agents-model-compatibility-when-using-o1-preview-o1-mini-or-perplexity-models-167dac1f
agent: ai_agents
When does this happen?
IF: When using o1-preview, o1-mini, or Perplexity models, which do not support the 'stop' parameter, crewAI's default call to litellm fails with a BadRequestError: 'Unsupported parameter: stop'.
How others solved it
THEN: Before passing parameters to litellm, check whether the model supports the 'stop' parameter. If it does not (e.g., the o1 series or Perplexity models), remove 'stop' from the kwargs. This can be done by patching litellm.completion to delete the 'stop' key, or by updating crewAI's LLM class to conditionally omit the default stop=['\nObservation:'] for such models.
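A minimal sketch of the conditional approach described above. The prefix list is an assumption taken from this report (o1 series, Perplexity) and is not exhaustive; newer litellm versions also expose get_supported_openai_params, which can replace the hard-coded list with a data-driven check.

```python
# Models assumed to reject the 'stop' parameter (illustrative, not exhaustive).
NO_STOP_PREFIXES = ("o1-preview", "o1-mini", "perplexity/")

def prepare_kwargs(model: str, kwargs: dict) -> dict:
    """Return a copy of kwargs with 'stop' removed when the model
    is known to reject it; other models keep their kwargs untouched."""
    cleaned = dict(kwargs)
    if model.startswith(NO_STOP_PREFIXES):
        cleaned.pop("stop", None)
    return cleaned
```

Calling prepare_kwargs(model, kwargs) just before litellm.completion keeps the default stop=['\nObservation:'] for models that accept it, while silently dropping it for the o1 series and Perplexity.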
```python
import litellm

# Keep a reference to the original completion function.
original = litellm.completion

def patched(*args, **kwargs):
    # Drop 'stop', which some models (o1 series, Perplexity) reject.
    kwargs.pop('stop', None)
    return original(*args, **kwargs)

litellm.completion = patched
```
Related patterns
model_loading
ai-agents-model-loading-loading-a-gemma-3-checkpoint-with-automodelforcaus-cc5b7a71
Tier 1 · 70%
anthropic_api_deprecation
ai-agents-anthropic-api-deprec-using-chatanthropic-from-langchain-community-with--be5e430f
Tier 1 · 70%
tool_call_id_validation
ai-agents-tool-call-id-validat-when-using-create-tool-calling-agent-with-an-input-770eceae
Tier 1 · 70%
tool_handling
ai-agents-tool-handling-repeated-identical-tool-function-names-in-consecut-18263441
Tier 1 · 70%
tool_calling_conflict
ai-agents-tool-calling-conflic-when-using-bedrock-models-with-both-structured-out-6184f1e9
Tier 1 · 70%
ollama_chunk_parsing
ai-agents-ollama-chunk-parsing-ollama-model-returns-thinking-field-in-streaming-c-0624da72
Tier 1 · 70%