Any Python example of how to send logs to the Grafana Cloud OTLP endpoint?

I have metrics and traces being sent to the Grafana Cloud OTLP endpoints (/v1/metrics, /v1/traces) successfully. See feat: Added OpenTelemetry Tracing Support · gshiva/pygptcourse@f84c368 · GitHub

I am trying to add logs alongside the traces and followed the example at opentelemetry-python/docs/examples/logs/example.py at main · open-telemetry/opentelemetry-python · GitHub, with the endpoint set to /v1/logs.
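
For reference, here is a minimal sketch of that setup using the OTLP/HTTP log exporter. The gateway URL and the Authorization header are placeholders for your own Grafana Cloud zone and instance-id:token pair, not values from my repo.

```python
import logging

from opentelemetry._logs import set_logger_provider
from opentelemetry.exporter.otlp.proto.http._log_exporter import OTLPLogExporter
from opentelemetry.sdk._logs import LoggerProvider, LoggingHandler
from opentelemetry.sdk._logs.export import BatchLogRecordProcessor
from opentelemetry.sdk.resources import Resource

# Resource attributes end up on every exported log record.
logger_provider = LoggerProvider(
    resource=Resource.create({"service.name": "TShirtLauncherControl"})
)
set_logger_provider(logger_provider)

# Placeholder endpoint and credentials: use your own Grafana Cloud OTLP gateway
# URL and a base64-encoded "<instance id>:<API token>" pair.
exporter = OTLPLogExporter(
    endpoint="https://otlp-gateway-<zone>.grafana.net/otlp/v1/logs",
    headers={"Authorization": "Basic <base64 instance-id:token>"},
)
logger_provider.add_log_record_processor(BatchLogRecordProcessor(exporter))

# Route the standard logging module through the OpenTelemetry SDK.
handler = LoggingHandler(level=logging.NOTSET, logger_provider=logger_provider)
logging.getLogger().addHandler(handler)

logging.getLogger(__name__).error("hello from the OTLP log exporter")
```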

Only a minor issue remains now, which I listed in the reply.

It was user error. I had set the exporter to the gRPC protocol (second time I have made that mistake, facepalm).
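
For anyone hitting the same thing: the Grafana Cloud OTLP gateway accepts OTLP over HTTP/protobuf, so the protocol needs to be http/protobuf rather than grpc. A sketch of what that looks like via the standard OTel environment variables (placeholder zone; these only take effect when the exporters are created through the SDK's auto-configuration, not when you instantiate a gRPC exporter class directly):

```python
import os

# The Grafana Cloud OTLP gateway speaks OTLP over HTTP/protobuf, not gRPC.
# Placeholder zone below; set these before the exporters are created.
os.environ["OTEL_EXPORTER_OTLP_PROTOCOL"] = "http/protobuf"
os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"] = "https://otlp-gateway-<zone>.grafana.net/otlp"
```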

Sending logs directly works now, but attaching the handler to the regular logger in my trace decorator seems to have an issue. I'm not sure why.

{"body":"Failed to export logs batch code: 204, reason: ","severity":"ERROR","resources":{"service.name":"TShirtLauncherControl","telemetry.sdk.language":"python","telemetry.sdk.name":"opentelemetry","telemetry.sdk.version":"1.22.0"},"instrumentation_scope":{"name":"opentelemetry.sdk._logs._internal"}}
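
For context, the decorator is roughly shaped like this (a hypothetical sketch, not the actual otel_decorators.py code): it opens a span and then logs through the ordinary logging module, which should be picked up by the LoggingHandler attached to the root logger.

```python
import functools
import logging

from opentelemetry import trace

tracer = trace.get_tracer(__name__)
log = logging.getLogger(__name__)  # inherits the root handlers, including the OTLP LoggingHandler


def traced(func):
    # Hypothetical decorator: run the wrapped call inside a span and log
    # through the regular logging module.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        with tracer.start_as_current_span(func.__name__):
            log.warning("calling %s", func.__name__)
            return func(*args, **kwargs)

    return wrapper
```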

It is very late for me, but I wanted to update the forum before I sign off.

It is working great, actually. As long as I send a warning or error log, it shows up. If someone can tell me how to get trace-log correlation working, that would be great (an example would be awesome).
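
One guess as to why only warning/error records show up (an assumption on my part, not something I have verified in my setup): Python's root logger defaults to the WARNING level, so lower-severity records never reach any handler, including the OTel one.

```python
import logging

# Python's root logger defaults to WARNING, so INFO/DEBUG records never reach
# any handler, including the OTel LoggingHandler attached to the root logger.
logging.getLogger().setLevel(logging.INFO)
logging.getLogger(__name__).info("this record now reaches the OTLP exporter too")
```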

Kudos to the Grafana team for making the OTLP endpoint available. It saves us so much time setting up, configuring, running, maintaining, and monitoring agents.


For a complete example, see pygptcourse/src/pygptcourse/otel_decorators.py at e5fb208729bb29ee3d70bf3e7b34e009118bd358 · gshiva/pygptcourse · GitHub and pygptcourse/.env.example at 21fd7da460a431eb4dcc14bc4b740f8ef00c580c · gshiva/pygptcourse · GitHub.

The key was following the documentation (OpenTelemetry Logging Instrumentation — OpenTelemetry Python Contrib documentation) for injecting trace context into the logs.
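
In short, the opentelemetry-instrumentation-logging package adds the active trace and span IDs to every stdlib log record. A minimal sketch, assuming that package is installed:

```python
import logging

from opentelemetry import trace
from opentelemetry.instrumentation.logging import LoggingInstrumentor

# Injects otelTraceID / otelSpanID (and the service name) into every stdlib
# log record, which is what lets Grafana correlate a log line with its trace.
LoggingInstrumentor().instrument(set_logging_format=True)

tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("launch"):
    # This record carries the IDs of the active span above.
    logging.getLogger(__name__).warning("launched t-shirt")
```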

Thanks to the Grafana and OpenTelemetry teams. More examples that don't rely on the OTel Collector would be helpful for newbies.


Have you looked at OTel auto-instrumentation and also Beyla?

I briefly looked at the examples for auto-instrumentation. It seemed that they were available only for .NET or Java.

I briefly looked at Beyla. It seems to be aimed at HTTP/HTTPS/gRPC apps, and mine is a command-line Python app.