Langtrace contributes to the official OpenTelemetry Instrumentation
Nov 7, 2024
By Karthik Kalyanaraman, Cofounder & CTO and Ali Waleed, Software Engineer
Introduction
For the past several months, our team at Langtrace has actively participated in OpenTelemetry's (OTEL) GenAI Special Interest Group (SIG) meetings. The GenAI SIG within OTEL provides guidance on the semantic conventions to adopt when tracing GenAI applications using OTEL standards. It includes core members from Microsoft, Elastic, Google, Traceloop, Honeycomb, and several other companies.
From day one, our vision has been to build and influence OTEL-standard tracing for GenAI applications. As part of our ongoing commitment to open source and OpenTelemetry, two months ago we began developing a tracing library in the official OpenTelemetry repository to support the OpenAI Python package.
Today, we are thrilled to announce that the first OpenAI instrumentation package in the official OpenTelemetry contrib repository is live.
What does this mean for developers
With this package, you can now set up tracing for your OpenAI-based AI applications by simply installing it and sending the traces to the observability tool of your choice. Just run the following pip command to install it.
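The install command below uses the package name published from the OpenTelemetry contrib repository, `opentelemetry-instrumentation-openai-v2`; confirm the current name on PyPI before installing:

```shell
pip install opentelemetry-instrumentation-openai-v2
```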
This means you can trace key metrics and metadata like prompts, generations, token counts, latency, and other inference metadata from any application that uses OpenAI's Python package for making inference calls. Note that the OpenAI Python package supports not only OpenAI's family of models but also other open-source models and model providers such as Llama, Mistral, Groq, and Grok.
And because the tracing follows the OpenTelemetry standard, these traces can be ingested and visualized not only in Langtrace but in any OpenTelemetry-native observability solution, such as Elastic, Grafana, SigNoz, Datadog, and IBM Instana. This truly democratizes GenAI observability for the developer community, much of which already uses OTEL-standard tracing tools.
Road Ahead
While this is just a start, we are actively working on semantic conventions not only for LLM clients but also for vector databases and agentic frameworks. In the meantime, if you are looking for OTEL-native tracing for other popular LLM providers, vector databases, or frameworks, check out Langtrace. We support native tracing for over 30 popular providers.
If you are a developer passionate about open-source observability or OpenTelemetry, we encourage you to join our weekly calls to shape the future of tracing AI applications built with LLMs, vector databases, or agentic frameworks. Additional details can be found below.
Links
GenAI semantic conventions - Standard attributes set by the OTEL GenAI committee.
Useful Resources
Getting started with Langtrace https://docs.langtrace.ai/introduction
Langtrace Github https://github.com/Scale3-Labs/langtrace
Ready to deploy?
Try out the Langtrace SDK with just 2 lines of code.
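As a sketch of that two-line setup (the import path and `init` signature are assumptions based on the `langtrace-python-sdk` package; check the Langtrace docs for the current API):

```python
from langtrace_python_sdk import langtrace

langtrace.init(api_key="<your-langtrace-api-key>")  # placeholder key
```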
Want to learn more?
Check out our documentation to learn more about how Langtrace works.
Join the Community
Check out our Discord community to ask questions and meet customers