Published : Nov 05, 2025
NOT ON THE CURRENT EDITION
This blip is not on the current edition of the Radar. If it was on one of the last few editions, it is likely still relevant. If the blip is older, it might no longer be relevant and our assessment might be different today.
Nov 2025
Assess

OpenInference is a set of conventions and plugins, complementary to OpenTelemetry, designed for observing AI applications. It provides standardized instrumentation for machine-learning frameworks and libraries, which helps developers trace LLM invocations along with surrounding context such as vector store retrievals or external tool calls to APIs and search engines. Spans can be exported to any OpenTelemetry-compatible collector, ensuring alignment with existing telemetry pipelines. We previously blipped Langfuse, a commonly used LLM observability platform; the OpenInference SDK can log traces into Langfuse and other OpenTelemetry-compatible observability platforms.
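As a sketch of how this typically fits together: the OpenInference instrumentor for a given library (here, the OpenAI client) is attached to a standard OpenTelemetry tracer provider, and spans flow out through a regular OTLP exporter. The collector endpoint below is an assumption (a local OTel Collector's default HTTP port); Langfuse and other OpenTelemetry-compatible backends are wired up the same way, just with a different endpoint and auth headers.

```python
# Sketch: trace OpenAI LLM calls via OpenInference into an OTLP-compatible
# collector. Assumes these packages are installed:
#   pip install openinference-instrumentation-openai openai \
#       opentelemetry-sdk opentelemetry-exporter-otlp-proto-http

from openinference.instrumentation.openai import OpenAIInstrumentor
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Export spans over OTLP/HTTP -- endpoint shown is a local OTel Collector
# default and is an assumption; swap in your backend's OTLP endpoint.
provider = TracerProvider()
provider.add_span_processor(
    SimpleSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces"))
)

# From here on, OpenAI client calls emit spans annotated with OpenInference
# semantic conventions (model name, prompts, token counts, tool calls, ...).
OpenAIInstrumentor().instrument(tracer_provider=provider)
```

The value-add over plain OpenTelemetry is the semantic conventions: downstream platforms can rely on consistently named span attributes for LLM-specific data rather than each framework inventing its own.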
