I’ve got a python app I am deploying with Docker. For reasons that I can’t explain I can’t just run the Grafana agent in a separate container and give it access to the logs.
I am considering just copying the static agent binary into my Python container and manually running the agent before starting my app in the container.
It doesn’t seem like an optimal way to do it, but I’m wondering what smarter people do?
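For reference, the approach described above can be sketched roughly like this (the binary name, paths, and entrypoint filename are assumptions, not something from this thread):

```dockerfile
FROM python:3.10-bookworm

# Hypothetical: a statically linked agent binary downloaded into the build
# context (or copied from an earlier build stage), plus its config
COPY grafana-agent /usr/local/bin/grafana-agent
COPY agent-config.yaml /etc/agent/agent.yaml
COPY entrypoint.sh /entrypoint.sh

WORKDIR /app
COPY . /app
RUN pip install -r requirements.txt

ENTRYPOINT ["/entrypoint.sh"]
```

with an entrypoint that backgrounds the agent and keeps the app as the container's main process:

```sh
#!/bin/sh
# Start the agent in the background, then exec the app in the foreground
# so the app stays PID 1's child and receives signals normally.
/usr/local/bin/grafana-agent -config.file=/etc/agent/agent.yaml &
exec python main.py
```

Note the drawback: if the backgrounded agent dies, nothing restarts it, which is part of why a sidecar container is usually preferred when it's available.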
I see two options: the first is what you mentioned, i.e. running the agent binary alongside your app; the other is configuring a custom Docker logging driver and using the Loki plugin for that.
Thanks, I considered that, but I also don’t have the means to install a Docker driver (since I won’t have access to the Docker host). Also, I am using other features of the agent (like traces).
Since you need other features of the agent as well, not only log shipping, I don’t see any other way apart from running the binary alongside your app.
You can also use the OpenTelemetry Python library and export everything (metrics, logs, traces) via the OTLP exporter directly from the Python app.
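A minimal sketch of that approach for traces, assuming the `opentelemetry-sdk` and `opentelemetry-exporter-otlp` packages are installed and that an OTLP receiver is listening on `localhost:4317` (the endpoint and service name here are placeholder assumptions):

```python
# Sketch: export spans straight from the app via OTLP, no agent sidecar.
# Assumes: pip install opentelemetry-sdk opentelemetry-exporter-otlp
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Identify the service and wire the exporter into a batching processor
provider = TracerProvider(
    resource=Resource.create({"service.name": "my-python-app"})
)
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="localhost:4317", insecure=True))
)
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("handle-request"):
    pass  # your application work goes here
```

Metrics and logs follow the same pattern with their own providers and OTLP exporters. The trade-off raised earlier still applies: the app process now carries the export work itself.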
OK, so what I just tried was to install the agent into my Docker image using apt-get. That appeared to work fine, but when I try to run it in my container I get the following error:
caller=diskstats_linux.go:267 level=error integration=node_exporter collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
FWIW this is running in a Python 3.10 container, which is Bookworm-based if I am not mistaken.
I’m still stuck here. I know I can export directly from my app without the agent, but I really like the idea of running a separate agent to handle the exporting properly and take the load off my app.
What am I missing?
You are missing that you are running in a container, so you don’t have the standard OS facilities available. You should disable all the default integrations that don’t make sense in a container, e.g. the node_exporter in this case.
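For example, in the agent's static-mode YAML config the integration can be switched off explicitly (a sketch; check the key names against the integrations section of your agent version's docs):

```yaml
integrations:
  node_exporter:
    # Host-level collectors like diskstats need /run/udev, /proc, /sys etc.
    # from the host, which a plain app container doesn't expose.
    enabled: false
```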
Yeah, I think that’s the path I followed (according to the docs). And I am pretty sure I tried explicitly disabling it, but it seemed to be ignored. Maybe I had the config wrong. Let me try it again and see.