Hello! From the Cloud docs it seems that the remote_write protocol is the primary method to push metrics into a Prometheus instance in Grafana Cloud.
However, say I have Prometheus metrics endpoints that are reachable over the Internet via HTTPS – is there a simple way, from within the Grafana Cloud Portal, to have those scraped periodically?
Hi @jgehrckevdgrafana!
You can connect to the self-managed Prometheus metrics endpoint by adding it as a data source in your Grafana Cloud environment:
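In a self-managed Grafana instance, a data source like this can also be declared in a provisioning file; in hosted Grafana you would add it through the UI, but the same fields apply. A minimal sketch, where the name, URL, and credentials are all placeholders for your own endpoint:

```yaml
# Hypothetical file, e.g. provisioning/datasources/prometheus.yaml
apiVersion: 1
datasources:
  - name: Self-managed Prometheus   # placeholder display name
    type: prometheus
    access: proxy                   # Grafana proxies queries to the endpoint
    url: https://prometheus.example.com   # placeholder URL
    basicAuth: true
    basicAuthUser: my-user          # placeholder credentials
    secureJsonData:
      basicAuthPassword: my-password
```

Note that this only lets Grafana *query* that Prometheus server on demand; it does not copy any data into Grafana Cloud.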
This does not scrape the data; however, you can still query and visualize the results in your hosted Grafana.
Alternatively, the Grafana Cloud Pro and Advanced plans include a feature called “Recorded Queries”, which allows you to create a time series from point-in-time data, such as the data returned by some of our API plugins.
You could use this feature with the self-managed Prometheus data source; however, I think the first option is likely the better approach, unless you prefer the data to be stored in your Grafana Cloud hosted Prometheus instance. If that is the case, you could use the Grafana Agent or remote write over HTTP.
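For the agent route, a small scrape-and-push setup could look roughly like the sketch below. This assumes the Grafana Agent's static-mode config layout; the job name, target host, remote write URL, and credentials are all placeholders you would replace with your own values:

```yaml
# Hypothetical Grafana Agent config (static mode): scrape an HTTPS
# metrics endpoint and remote_write the samples into Grafana Cloud.
metrics:
  global:
    scrape_interval: 60s
  configs:
    - name: cloud-push
      scrape_configs:
        - job_name: my-service            # placeholder job name
          scheme: https                   # scrape the target over HTTPS
          static_configs:
            - targets: ['metrics.example.com']   # placeholder target
      remote_write:
        - url: https://prometheus-<region>.grafana.net/api/prom/push  # your stack's push URL
          basic_auth:
            username: "123456"            # placeholder instance ID
            password: my-grafana-cloud-api-key   # placeholder API key
```

The agent does the periodic scraping locally and pushes the resulting series upstream, so only outbound connectivity from wherever the agent runs is required.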
> This does not scrape the data

Well, this is not really helping then!

> however you can still query and visualize the results in your hosted Grafana.

No: when the endpoint is not periodically scraped (to fill a time series database), there is also no time series database to query from.
It’s indeed a surprise that you peeps have no UI where we can simply add an existing Prom metrics endpoint : ) I can rather vividly imagine that would be a use case driving people into Grafana Cloud! JP
I understand that I can run the agent to do the scrape-and-push – but then I need to manage an environment to run that agent in.
Hi Jake, just a fellow engineer sharing my two cents on this.
Security: I believe this is the main reason why vendors don't scrape your targets and instead ask you to push metrics. It's very unlikely that you would want to expose your service's metrics endpoint to the public so that a cloud vendor can scrape it over the Internet.
As a user, you would need to adjust your firewalls, WAF, etc. to make this work.
I also think scraping data from public endpoints may not be practical for large-scale deployments, from a scalability perspective.
While it may be feasible for small deployments that expose a handful of metrics, most vendors won't implement it, since few large customers would ever allow it.