How could I pull Loki records from a Python script?

Hello team:
I would like to dig into the Loki records in order to find patterns. For instance, a sequence of records coming from a LAN switch tells me that one switch port is flapping once every 10 seconds:

In cases like this, if the sequence repeats beyond some upper bound of time, I would like to notify the network guys in order to fix the potential issue in the network device.

But I do not want to waste my time looking at the Grafana dashboards. I want this task to be delegated to a program.

I wonder how to address this requirement from a programming point of view. I would have liked to automate it from a Python script, but AFAIK there is no Python library for pulling records from Loki. At least the ones I found (loki-client, python-loki-client) do not work.

How are these tasks being carried out currently?

Any hints will be greatly appreciated
Best regards


You could use the HTTP API.

Thank you Raymond!

I forgot to say that I am already working with the HTTP API. It seems that, for the time being, this is the way to go.
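For what it's worth, Loki also exposes its own HTTP API next to Grafana's, so the records can be pulled without going through the Grafana backend at all. Here is a minimal sketch using only the standard library, assuming Loki listens on `localhost:3100` (the URL and the label values in the query are placeholders to adapt to your setup):

```python
import json
import urllib.parse
import urllib.request

LOKI_URL = "http://localhost:3100"  # assumption: adjust to your Loki instance

def build_query_url(switch, start_ns, end_ns, limit=1000):
    """Build the URL for Loki's native /loki/api/v1/query_range endpoint."""
    params = {
        "query": '{hostname="' + switch + '", loglevel="5"} |= `changed state to`',
        "start": str(start_ns),  # Unix timestamps in nanoseconds
        "end": str(end_ns),
        "limit": str(limit),
    }
    return LOKI_URL + "/loki/api/v1/query_range?" + urllib.parse.urlencode(params)

def fetch_records(switch, start_ns, end_ns):
    """Fetch matching log lines; each stream in data.result carries its
    label set plus a list of [timestamp, line] pairs."""
    with urllib.request.urlopen(build_query_url(switch, start_ns, end_ns)) as resp:
        body = json.load(resp)
    return body["data"]["result"]
```

The same `requests`-based style as the rest of this thread would work too; `urllib` is used here only to keep the sketch dependency-free.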

Best regards, Rogelio


I am not so proud of this piece of code, but the following gave me access to the Loki records 🙂

The following is the payload used as the body of the POST request to Grafana to ask it for the Loki records:

def payload(switch):
    argument = {
        "queries": [
            {
                "refId": "A",
                "key": "Q-f03bc30a-9f33-45a4-8fd0-9d9c3662b890-0",
                "datasource": {
                    "type": "loki",
                    "uid": "JKn0-PDSk"
                },
                "editorMode": "builder",
                "expr": '{hostname="' + switch + '", loglevel="5"} |= `changed state to`',
                "queryType": "range",
                "maxLines": Loki_lines_to_retrieve,  # defined elsewhere in the script
                "legendFormat": "",
                "datasourceId": 22,
                "intervalMs": 15000,
                "maxDataPoints": 1485
            }
        ],
        "from": "2024-01-08T00:00:00.000Z",
        "to": "2024-01-08T23:59:59.000Z"
    }
    return argument

The following are the other arguments of the POST:


headers = {'Authorization': apikey, 'Accept': 'application/json', 'Content-Type': 'application/json'}


And then the following is the POST executed to gather the actual records:

response = requests.post(url, data=json.dumps(PLD), headers=headers)  # url: the Grafana query endpoint (e.g. /api/ds/query)

Once I have the answer from Grafana, I iterate over the results (provided the POST returned a 200 OK). The following lines are executed in a loop:
while index >= 0:
    Filtered = LOG.split("Interface ", 1)[1]
    STATE = Filtered.split("state to ", 1)[1].replace("\n", "")
(… more lines follow but not shown here…)

It worked quite well, although I would have liked a library like the one I use to query InfluxDB for its records…

