Integrating SQS between the Alloy and Prometheus connection

We are working on a use case where Alloy writes data directly to the Prometheus endpoint.
This is the default configuration:

```alloy
prometheus.remote_write "default" {
  endpoint {
    url = "http://localhost:9009/api/prom/push"
  }
}
```

With this setup, if Prometheus is down, we lose the data.

So we were wondering whether we could put a messaging application between Alloy and Prometheus.

Is there an option to use SQS as the endpoint URL in the Alloy configuration? Alloy would send the data to SQS, and Prometheus would then consume the data from SQS.

What would happen if that messaging application is also down? Do you see where I am going with this?

If you need SQS in front of Prometheus, then I don't think Alloy will fit your needs, because it supports neither writing to nor reading from SQS. It does support reading from Kafka, though I haven't personally tried it, and it seems rigid (you need to specify topics, for example).

Unless your metrics are super critical, I'd advocate not over-complicating things. Alloy has backoff and WAL settings in `prometheus.remote_write`, so you can potentially make it hold on to metrics for an hour or two while your remote endpoint is unavailable. Unless you really need the historical data on your metrics, perhaps consider that instead.
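As a sketch of what that could look like, here is the default configuration from above with the WAL and retry settings tuned for a longer outage. The attribute names come from the Alloy `prometheus.remote_write` reference; the specific durations are just illustrative assumptions, so verify them against the docs for your Alloy version:

```alloy
prometheus.remote_write "default" {
  // The WAL buffers samples on disk while the endpoint is unreachable.
  // min_keep_time / max_keep_time bound how long data survives truncation.
  wal {
    truncate_frequency = "2h"
    min_keep_time      = "1h"
    max_keep_time      = "8h"
  }

  endpoint {
    url = "http://localhost:9009/api/prom/push"

    queue_config {
      // Keep retrying failed sends with exponential backoff,
      // including on HTTP 429 responses from the endpoint.
      min_backoff       = "500ms"
      max_backoff       = "5m"
      retry_on_http_429 = true
    }
  }
}
```

With settings along these lines, Alloy keeps samples in its write-ahead log and retries delivery once Prometheus comes back, which covers the "Prometheus is down for a while" case without any extra infrastructure.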

If you absolutely have to use some sort of messaging platform, then unless you want to write your own tools, I think Kafka may have better open-source support than SQS.
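If you do go the Kafka route, the consuming side could conceivably be Alloy itself rather than a custom tool, via the OpenTelemetry Kafka receiver component. A rough sketch, assuming metrics are published to a hypothetical `metrics` topic in OTLP form (component and attribute names are from my reading of the Alloy docs; treat this as an assumption to verify, not a tested pipeline):

```alloy
// Consume OTLP metrics from Kafka (broker address and topic are assumptions).
otelcol.receiver.kafka "default" {
  brokers          = ["kafka:9092"]
  protocol_version = "2.0.0"
  topic            = "metrics"

  output {
    metrics = [otelcol.exporter.prometheus.default.input]
  }
}

// Convert the OTLP metrics to Prometheus format and hand them to remote_write.
otelcol.exporter.prometheus "default" {
  forward_to = [prometheus.remote_write.default.receiver]
}

prometheus.remote_write "default" {
  endpoint {
    url = "http://localhost:9009/api/prom/push"
  }
}
```

Note this still ends in a `remote_write` hop, so it only helps if the Kafka consumer runs somewhere more reliable than the original Alloy instance; it doesn't remove the "what if the messaging layer is down" question raised above.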