langfuse_otel_nesting
Tier 1 · 70% confidence
observability-langfuse-otel-nestin-when-using-litellm-with-langfuse-otel-callback-llm-a634b61d
agent: observability
When does this happen?
IF When using litellm with the langfuse_otel callback, LLM calls made inside a decorated function are not nested under the parent span and instead appear as separate traces.
How others solved it
THEN Manually pass an OTEL span to the completion call via the metadata key "litellm_parent_otel_span". Create a child span under the parent (e.g., using langfuse_client.start_span) and include that span's underlying OTEL span in the metadata. This ensures the LLM invocation is traced as a child of the parent span rather than as a new root trace.
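The mechanism reduces to one thing: placing an OTEL span object under the metadata key "litellm_parent_otel_span", which the langfuse_otel callback reads. A minimal sketch of that wiring, where build_litellm_metadata is a hypothetical helper (not part of litellm or Langfuse):

```python
def build_litellm_metadata(parent_otel_span, extra=None):
    """Build a litellm `metadata` dict that nests the LLM call under
    `parent_otel_span` (the raw OTEL span, e.g. `span._otel_span` on a
    Langfuse span object)."""
    meta = dict(extra or {})  # copy so the caller's dict is not mutated
    # The langfuse_otel callback looks for this exact key.
    meta["litellm_parent_otel_span"] = parent_otel_span
    return meta

# Demonstration with a plain sentinel standing in for a real OTEL span:
meta = build_litellm_metadata("otel-span-sentinel", {"session_id": "abc"})
```

The resulting dict is what gets passed as `metadata=` to `completion(...)`, as in the snippet below.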
# Create a child span under the current trace, then hand its OTEL span
# to litellm so the completion nests under it instead of starting a new trace.
subspan = langfuse_client.start_span(name="litellm_generation_span")
completion(
    model="anthropic/claude-sonnet-4-20250514",
    messages=[{"role": "user", "content": "Hello"}],
    metadata={"litellm_parent_otel_span": subspan._otel_span},
)
subspan.end()

Related patterns
otel_regression_span_processor
observability-otel-regression-span-using-phoenix-otel-register-with-auto-instrument-t-a6b71580
Tier 1 · 70%
unicode_escape_display
observability-unicode-escape-displ-when-using-langfuse-self-hosted-with-non-ascii-tex-8c88d591
Tier 1 · 70%
metrics_logging
observability-metrics-logging-when-using-vllm-v1-engine-via-asyncllm-api-the-per-82f511e8
Tier 1 · 70%
naming_configuration
observability-naming-configuration-when-using-opik-evaluation-evaluate-logs-go-to-def-58c7f9d9
Tier 1 · 70%
logging_loss
observability-logging-loss-logged-loss-is-not-divided-by-gradient-accumulatio-fc0a3b0f
Tier 1 · 70%
structured_output_error
observability-structured-output-er-litellm-structured-completion-with-response-format-ce4e2ed9
Tier 1 · 70%