Published: Nov 05, 2025
Assess

OpenInference is a set of conventions and plugins; it’s complementary to OpenTelemetry and designed to observe AI applications. It provides standardized instrumentation for machine-learning frameworks and libraries, which helps developers trace LLM invocations along with surrounding context such as vector store retrievals or external tool calls to APIs and search engines. Spans can be exported to any OTEL-compatible collector, ensuring alignment with existing telemetry pipelines. We previously blipped Langfuse, a commonly used LLM observability platform — the OpenInference SDK can log traces into Langfuse and other OpenTelemetry-compatible observability platforms.
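As a minimal sketch of how this fits together, the snippet below wires an OpenInference instrumentor into a standard OpenTelemetry pipeline and exports spans over OTLP. It assumes the openinference-instrumentation-openai package alongside the OpenTelemetry Python SDK and OTLP HTTP exporter; the collector endpoint, model name and prompt are placeholders, and the exact endpoint and auth headers for Langfuse or another backend would depend on that platform.

```python
# Sketch: trace OpenAI calls with OpenInference and export spans to an
# OTEL-compatible collector. Assumes openinference-instrumentation-openai,
# opentelemetry-sdk, opentelemetry-exporter-otlp-proto-http and openai are
# installed; the endpoint URL and model are placeholders.
from openinference.instrumentation.openai import OpenAIInstrumentor
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
import openai

# Standard OpenTelemetry setup: spans go to whatever OTLP collector you run
# (placeholder local endpoint; swap in your observability platform's endpoint).
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces"))
)

# Attach OpenInference's OpenAI instrumentation; subsequent completion calls
# are recorded as spans carrying LLM-specific semantic attributes.
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

client = openai.OpenAI()  # reads OPENAI_API_KEY from the environment
client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize OpenInference in one line."}],
)
```

Because the exporter is plain OTLP, the same setup feeds Langfuse or any other OpenTelemetry-compatible backend without changing the instrumentation code.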
