model_compatibility — Tier 1 · 70% confidence


agent: ai_agents

When does this happen?

IF using o1-preview, o1-mini, or a Perplexity model that does not support the 'stop' parameter, crewAI's default call to litellm fails with a BadRequestError: 'Unsupported parameter: stop'.

How others solved it

THEN before passing parameters to litellm, check whether the model supports the 'stop' parameter. If it does not (e.g., the o1 series or Perplexity models), remove 'stop' from the kwargs. This can be done either by patching litellm.completion to delete the 'stop' key, or by updating crewAI's LLM class to conditionally omit the default stop=['\nObservation:'] for those models.

```python
import litellm

# Keep a reference to the original completion function, then install a
# wrapper that drops the unsupported 'stop' parameter before delegating.
_original_completion = litellm.completion

def _patched_completion(*args, **kwargs):
    kwargs.pop('stop', None)  # remove 'stop' if present; no-op otherwise
    return _original_completion(*args, **kwargs)

litellm.completion = _patched_completion
```
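The second approach, conditionally omitting 'stop' per model, can be sketched as a small standalone helper that filters kwargs before any litellm call. The model-name list and the helper name here are assumptions for illustration, not part of crewAI's API; adjust the list to the providers you actually use:

```python
# Models assumed to reject the 'stop' parameter; extend as needed.
NO_STOP_MODELS = ('o1-preview', 'o1-mini', 'perplexity')

def filter_llm_kwargs(model: str, **kwargs) -> dict:
    """Drop the 'stop' key for models that do not support it,
    leaving kwargs untouched for all other models."""
    if any(name in model for name in NO_STOP_MODELS):
        kwargs.pop('stop', None)
    return kwargs
```

A wrapper built on this helper only strips 'stop' for the affected models, so other models keep crewAI's default stop sequence and their ReAct-style parsing keeps working.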

