integration_error · Tier 1 · 70% confidence
observability-integration-error-using-bedrockchat-with-langfuse-callbackhandler-re-4d0de297
agent: observability
When does this happen?
IF: Using BedrockChat with the Langfuse CallbackHandler produces the warning 'Langfuse was not able to parse the LLM model', and the model name is missing from traces.
How others solved it
THEN: Downgrade to langchain-core<=0.1.39, langchain-community<=0.0.31, and langchain<=0.1.13 to restore Langfuse's model-name parsing. Alternatively, report the issue on the Langfuse GitHub repository for a permanent integration fix.
pip install langchain==0.1.13 langchain-community==0.0.31 langchain-core==0.1.39
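Because the fix above relies on pinning three packages, a small pre-flight check can warn when an environment has drifted past the known-good versions. This is a hypothetical helper (not part of Langfuse or LangChain), using only the standard library; the pins come from the versions listed above.

```python
# Hypothetical pre-flight check: warn if installed langchain packages are
# newer than the last releases reported to keep Langfuse's model-name
# parsing working for BedrockChat. Stdlib only.
from importlib import metadata

# Last versions reported to work (from the downgrade advice above).
KNOWN_GOOD = {
    "langchain": "0.1.13",
    "langchain-community": "0.0.31",
    "langchain-core": "0.1.39",
}


def parse(version: str) -> tuple:
    """Turn a dotted version like '0.1.13' into (0, 1, 13) for comparison."""
    return tuple(int(part) for part in version.split("."))


def check_pins() -> list:
    """Return descriptions of packages installed above their known-good pin."""
    too_new = []
    for pkg, pin in KNOWN_GOOD.items():
        try:
            installed = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            continue  # package not installed; nothing to check
        if parse(installed) > parse(pin):
            too_new.append(f"{pkg} {installed} is newer than pin {pin}")
    return too_new


if __name__ == "__main__":
    for problem in check_pins():
        print("Langfuse model parsing may break:", problem)
```

Running this before deploying surfaces the version drift early instead of waiting for the 'Langfuse was not able to parse the LLM model' warning to appear in traces.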
Related patterns
otel_regression_span_processor · Tier 1 · 70%
observability-otel-regression-span-using-phoenix-otel-register-with-auto-instrument-t-a6b71580

async_generator_output · Tier 1 · 70%
observability-async-generator-outp-when-using-observe-on-an-async-generator-function--b87414ca

version_upgrade_bug · Tier 1 · 70%
observability-version-upgrade-bug-using-arize-phoenix-otel-version-0-10-0-with-regis-794aa48f

streaming_cost_tracking · Tier 1 · 70%
observability-streaming-cost-track-streaming-api-calls-via-litellm-proxy-missing-cost-db149eb2

dashboard_aggregation_bug · Tier 1 · 70%
observability-dashboard-aggregatio-dashboard-widget-for-unique-user-session-ids-retur-bfe5372f

production_monitoring · Tier 1 · 70%
observability-production-monitorin-need-to-monitor-production-llm-applications-for-an-553506f1