openllmetry by traceloop

Description: Open-source observability for your GenAI or LLM application, based on OpenTelemetry

View traceloop/openllmetry on GitHub ↗

Summary Information

Updated 1 hour ago
Added to GitGenius on August 4th, 2025
Created on September 2nd, 2023
Open Issues/Pull Requests: 374 (+0)
Number of forks: 888
Total Stargazers: 6,855 (+0)
Total Subscribers: 21 (+0)
Detailed Description

traceloop/openllmetry is an open-source project that aims to provide a fully featured, production-ready OpenTelemetry Collector distribution optimized for observability of Large Language Model (LLM) applications. While the standard OpenTelemetry Collector is powerful, it requires significant configuration and tuning to handle the unique characteristics of LLM workloads: high-cardinality attributes, rich request context such as prompts and completions, and the need for specialized analysis. This project simplifies that process, offering a pre-configured Collector tailored for LLM tracing, metrics, and logging.

At its core, the repository provides Docker images and Kubernetes manifests for a complete OpenTelemetry Collector setup. These aren't just basic images; they include pre-installed receivers, processors, and exporters commonly used in LLM observability. Key receivers include those for popular LLM frameworks like vLLM, LangChain, and LlamaIndex, allowing seamless integration with these tools. Processors are crucial for enriching traces and metrics with LLM-specific context, such as prompt templates, model names, and token usage. Exporters support sending data to a variety of backends, including Traceloop’s own platform (designed for LLM observability), Prometheus, Jaeger, and others. The project emphasizes ease of deployment and minimal configuration, allowing users to quickly get started with LLM observability.
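To make the receiver/processor/exporter pipeline concrete, here is a minimal Collector configuration sketch in the spirit of the setup described above. The component names shown (`otlp`, `batch`, `prometheus`) are standard Collector components, not this project's actual defaults; the LLM-specific receivers and processors mentioned above would slot into the same `receivers:` and `processors:` sections. Consult the repository's own manifests for the real configuration.

```yaml
# Illustrative OpenTelemetry Collector config; component names are
# assumptions chosen for the sketch, not the project's shipped defaults.
receivers:
  otlp:
    protocols:
      grpc:
      http:

processors:
  batch:

exporters:
  prometheus:
    endpoint: "0.0.0.0:8889"
  otlp/jaeger:
    endpoint: "jaeger:4317"

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp/jaeger]
    metrics:
      receivers: [otlp]
      processors: [batch]
      exporters: [prometheus]
```

A deployment as described in the text would extend this skeleton with the framework-specific receivers and the LLM enrichment and redaction processors, then point the exporters at Traceloop's platform, Prometheus, or Jaeger as needed.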

A significant differentiator is the inclusion of specialized processors designed for LLM data. These processors perform tasks like extracting key-value pairs from LLM logs (e.g., prompt, completion, cost), calculating token counts, and adding metadata to traces for better analysis. The `llm-enrichment` processor is particularly important, automatically adding context to spans and events, making it easier to understand the flow of data through an LLM application. Furthermore, the project incorporates processors for redaction of sensitive information, a critical requirement when dealing with user-provided prompts and completions. This focus on LLM-specific processing significantly reduces the effort required to derive meaningful insights from observability data.
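The enrichment-and-redaction behavior described above can be sketched in plain Python. The function names, attribute keys, and regex pattern here are hypothetical, meant only to show conceptually what such a processor does to telemetry before export: extract key fields from a raw LLM log record, estimate token counts, and mask sensitive values.

```python
import re

# Hypothetical sketch of an LLM enrichment + redaction step. Real
# processors in a Collector pipeline operate on OTLP data and use model
# tokenizers; this illustrates the transformation, not the actual API.

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def rough_token_count(text: str) -> int:
    # Crude whitespace heuristic; production code would use a tokenizer.
    return len(text.split())

def enrich_and_redact(record: dict) -> dict:
    """Turn a raw LLM log record into redacted, enriched span attributes."""
    prompt = record.get("prompt", "")
    completion = record.get("completion", "")
    return {
        "llm.model": record.get("model", "unknown"),
        "llm.prompt": EMAIL_RE.sub("[REDACTED]", prompt),
        "llm.completion": EMAIL_RE.sub("[REDACTED]", completion),
        "llm.prompt_tokens": rough_token_count(prompt),
        "llm.completion_tokens": rough_token_count(completion),
    }

span_attrs = enrich_and_redact({
    "model": "gpt-4o",
    "prompt": "Summarize the email from alice@example.com",
    "completion": "Alice asks about the Q3 report.",
})
print(span_attrs["llm.prompt"])  # email address is masked before export
```

The key design point mirrored here is that redaction happens inside the pipeline, before data reaches any backend, so user-provided prompts and completions never leave the Collector unmasked.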

The repository also provides extensive documentation and examples. This includes guides on deploying the Collector in various environments (local development, Kubernetes), configuring receivers and exporters, and using the LLM-specific processors. Example configurations demonstrate how to integrate with different LLM frameworks and observability backends. The documentation is geared towards both beginners and experienced OpenTelemetry users, offering clear explanations and practical guidance. The project actively encourages community contributions, with clear guidelines for submitting bug reports, feature requests, and pull requests.

Finally, traceloop/openllmetry isn’t intended to *replace* the standard OpenTelemetry Collector, but rather to *extend* it. It provides a curated and optimized distribution for LLM workloads, building upon the foundation of OpenTelemetry. Users can still customize the Collector further by adding or removing components as needed. The project’s goal is to lower the barrier to entry for LLM observability, enabling developers and operators to gain deeper insights into the performance, cost, and behavior of their LLM applications, ultimately leading to more reliable and efficient systems.
