prompt_linking · Tier 1 · 70% confidence


agent: observability

When does this happen?

IF model calls traced in Langfuse are not linked to the prompt, even though `langfuse_prompt` is set in the metadata, when using LangChain v1's `ainvoke` with a `CallbackHandler`.

How others solved it

THEN pass the prompt object (obtained via `get_client().get_prompt(...)`) directly in the invocation metadata as `{"langfuse_prompt": prompt}`, and make sure the prompt object supports the `.get_langchain_prompt()` method. Do not pass a `ChatPromptTemplate` object directly as the input messages; instead, format the messages via the prompt's own method.

```python
from langfuse import get_client
from langfuse.langchain import CallbackHandler  # Langfuse SDK v3 import path

langfuse_handler = CallbackHandler()
prompt = get_client().get_prompt(prompt_id, type="chat")

# Format the messages via the prompt itself, and pass the prompt object in
# metadata so Langfuse can link the generations back to it.
await agent.ainvoke(
    {"messages": prompt.get_langchain_prompt(**data)},
    config={"callbacks": [langfuse_handler], "metadata": {"langfuse_prompt": prompt}},
)
```
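For context, Langfuse prompts use mustache-style `{{var}}` placeholders, and `.get_langchain_prompt()` returns the chat messages with those placeholders converted to LangChain's single-brace `{var}` template style. A rough pure-Python sketch of that conversion (illustrative only, not the SDK's actual implementation; the message list is made up):

```python
import re

def to_langchain_placeholders(content: str) -> str:
    # Convert Langfuse mustache-style {{var}} placeholders into
    # LangChain's single-brace {var} template style (sketch only).
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", r"{\1}", content)

# Hypothetical chat prompt as stored in Langfuse
messages = [
    {"role": "system", "content": "You are a helpful assistant for {{product}}."},
    {"role": "user", "content": "{{question}}"},
]

# Shape comparable to what get_langchain_prompt() yields for chat prompts
langchain_messages = [
    (m["role"], to_langchain_placeholders(m["content"])) for m in messages
]
# e.g. ("system", "You are a helpful assistant for {product}.")
```

This is why the messages should be built from the prompt object rather than a hand-made `ChatPromptTemplate`: the object carries both the converted template and the version metadata Langfuse needs for linking.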
