Translating a Splunk dashboard to a Grafana dashboard

Hello,
I have the Splunk dashboard below, which is built by querying an Elasticsearch datasource.
How can I translate this to Grafana? Especially the counts.
index=gni sourcetype=omni:adminportal source=goapi**np.log "Request Info" URL NOT actuator
| stats count(eval(STATUS<300 or STATUS>=400)) as count,
  count(eval(STATUS=200 AND PROCESS_TIME<=100)) as success,
  count(eval(STATUS>=400 OR (STATUS=200 AND PROCESS_TIME>100))) AS failures,
  perc99(PROCESS_TIME) AS perc99_resp_time
| eval success_rate=round(success/count*100,2)
| eval raw_error_rate=round(failures/count*100,2)
| table success_rate
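
For anyone attempting the same translation: the conditional counts and the perc99 in this stats command map fairly directly onto an Elasticsearch filters aggregation plus a percentiles aggregation. A minimal sketch, assuming the documents expose numeric STATUS and PROCESS_TIME fields equivalent to the Splunk field extractions (the index name and field names below are placeholders, not taken from real data):

POST /your-index/_search
{
  "size": 0,
  "aggs": {
    "requests": {
      "filters": {
        "filters": {
          "count":    { "query_string": { "query": "STATUS:<300 OR STATUS:>=400" } },
          "success":  { "query_string": { "query": "STATUS:200 AND PROCESS_TIME:<=100" } },
          "failures": { "query_string": { "query": "STATUS:>=400 OR (STATUS:200 AND PROCESS_TIME:>100)" } }
        }
      }
    },
    "perc99_resp_time": {
      "percentiles": { "field": "PROCESS_TIME", "percents": [99] }
    }
  }
}

The success_rate and raw_error_rate evals are not part of the aggregation itself; they would be computed from the returned bucket doc_count values, for example with a Grafana transformation or in whatever consumes the response.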

Welcome

Could you please post a sample ES document and say what visualization you want to see it as?

Hi @wabmca,

Welcome to the :grafana: community support forums!!

Please review the submission template and include more details:

  • What Grafana version and what operating system are you using?
  • What is your datasource?
  • What visualization panel are you using, e.g. time-series, bar chart, histogram, etc.?
  • What are you trying to achieve?
  • How are you trying to achieve it?
  • What happened?
  • What did you expect to happen?
  • Can you copy/paste the configuration(s) that you are having problems with?
  • Did you receive any errors in the Grafana UI or in related logs? If so, please tell us exactly what they were.
  • Did you follow any online instructions? If so, what is the URL?

Hi, thank you very much for replying.
My datasource is Elasticsearch logs.
What I'm trying to do is first query the logs and get
count(eval(STATUS<300 or STATUS>=400)) as a count variable,
count(eval(STATUS=200 AND PROCESS_TIME<=100)) as a success variable,
count(eval(STATUS>=400 OR (STATUS=200 AND PROCESS_TIME>100))) as a failures variable,
then calculate the percentage from those counts and finally show it in a text or stat panel.
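To make the arithmetic concrete with made-up numbers: if a time range gave count = 1200, success = 1140 and failures = 60, the Splunk evals would produce success_rate = round(1140 / 1200 * 100, 2) = 95.0 and raw_error_rate = round(60 / 1200 * 100, 2) = 5.0, and it is those final numbers I want shown in the stat or text panel.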

Please help me… Thanks in advance.

Hi @usman.ahmad ,

  • What Grafana version and what operating system are you using?
    8.0, Linux
  • What is your datasource?
    elasticsearch.
  • What visualization panel are you using, e.g. time-series, bar chart, histogram, etc.?
    Stat or text
  • What are you trying to achieve?
    What I'm trying to do is first query the logs and get
    count(eval(STATUS<300 or STATUS>=400)) as a count variable,
    count(eval(STATUS=200 AND PROCESS_TIME<=100)) as a success variable,
    count(eval(STATUS>=400 OR (STATUS=200 AND PROCESS_TIME>100))) as a failures variable,
    then calculate the percentage from those counts and finally show it in a text or stat panel.
  • How are you trying to achieve it?
  • What happened?
    I don't know how to get the counts into variables and then calculate the percentage from those variables. Please kindly help.
  • What did you expect to happen?
  • Can you copy/paste the configuration(s) that you are having problems with?
  • Did you receive any errors in the Grafana UI or in related logs? If so, please tell us exactly what they were.
  • Did you follow any online instructions? If so, what is the URL?

@yosiasz kindly please help!

@usman.ahmad , Kindly please help!

Still waiting for you to provide sample data.

“Could you please post a sample ES document”

We don't have access to your ES, hence we need to emulate it on our own ES, which requires you to provide sample data.

Hi @yosiasz
Here is the sample data. I have tons and tons of logs in Elasticsearch.
{"portalEnvironment":"prod","host":"poohapi2-13-pvfgn","event_env":"prod","version":"0.4.23","@version":"1","data_stream.dataset":"kggd-prod","data_stream.namespace":"default","data_stream.type":"logs","appData.request.hoursOfOperation":false,"appData.request.function":"Sales","appData.request.duration":2,"appData.request.inputDate":[2022,10,31],"appData.request.callType":["abc_GovtPrograms_Sales_Group_ConnectorClient3_English"],"appData.request.env":"prod","appData.request.unit":"abc","appData.request.segment":"GovtPrograms","appData.request.mediaType":["voice"],"appData.timings.duration":36,"appData.timings.start":1667242504.509381,"appData.timings.unit":"ms","appData.timings.end":1667242504.545015,"appData.status":"200 OK","appName":"poohapi-2","namespace":"pooh-api","dc":"pkr","@timestamp":"2022-10-31T18:55:04.574Z","cluster":"ocp-pkr"}

I have also developed a Lucene query to retrieve these logs:
data_stream.dataset.keyword:kggd-prod AND appName:poohapi-2 AND appData.status:200 AND appData.timings.duration:<=100

I just want a way to record the counts of the logs fetched by the query and a way to perform calculations on them, like success percent, fail percent, total requests, etc.
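
In other words, I imagine something roughly like the two count requests below: the first uses my Lucene query for successful fast requests, the second drops the status/duration conditions to get the total, and success percent would then just be the first count divided by the second, times 100. (This is only a sketch of what I am after, not something I have working; the data stream name logs-kggd-prod-default is my guess from the data_stream fields.)

GET /logs-kggd-prod-default/_count
{
  "query": {
    "query_string": {
      "query": "data_stream.dataset.keyword:kggd-prod AND appName:poohapi-2 AND appData.status:200 AND appData.timings.duration:<=100"
    }
  }
}

GET /logs-kggd-prod-default/_count
{
  "query": {
    "query_string": {
      "query": "data_stream.dataset.keyword:kggd-prod AND appName:poohapi-2"
    }
  }
}

In Grafana I would expect these to become two Elasticsearch queries with the Count metric and those Lucene strings as the query filter, with the division done afterwards.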

Please post a valid ES document. Here is how you can capture it:

Hi @yosiasz

Please find the json doc below:

{
  "_index": ".ds-logs-kggd-prod-default-2022.10.27-000006",
  "_type": "_doc",
  "_id": "MBVBL4QB31osgnb1Crp3",
  "_version": 1,
  "_score": null,
  "fields": {
    "cluster": [
      "gck-pkr"
    ],
    "appData.request.duration": [
      2
    ],
    "appData.request.inputDate": [
      2022,
      10,
      31
    ],
    "data_stream.dataset.keyword": [
      "kggd-prod"
    ],
    "appData.request.mediaType": [
      "voice"
    ],
    "data_stream.namespace.keyword": [
      "default"
    ],
    "appData.request.function.keyword": [
      "PriorAuth"
    ],
    "version.keyword": [
      "0.4.23"
    ],
    "appData.request.segment": [
      "umRx"
    ],
    "appData.request.env.keyword": [
      "prod"
    ],
    "appData.status.keyword": [
      "200 OK"
    ],
    "appData.timings.end": [
      1667240060
    ],
    "data_stream.type.keyword": [
      "logs"
    ],
    "host": [
      "poohapi2-14-xzs87"
    ],
    "@version": [
      1
    ],
    "appData.request.mediaType.keyword": [
      "voice"
    ],
    "appData.timings.unit": [
      "ms"
    ],
    "portalEnvironment": [
      "prod"
    ],
    "appName": [
      "poohapi-2"
    ],
    "appData.request.unit.keyword": [
      "OPT"
    ],
    "data_stream.namespace": [
      "default"
    ],
    "dc.keyword": [
      "pkr"
    ],
    "appData.request.segment.keyword": [
      "umRx"
    ],
    "appData.timings.start": [
      1667240060
    ],
    "appData.request.env": [
      "prod"
    ],
    "event_env.keyword": [
      "prod"
    ],
    "namespace.keyword": [
      "cl2-pooh-api"
    ],
    "event_env": [
      "prod"
    ],
    "cluster.keyword": [
      "gck-pkr"
    ],
    "version": [
      "0.4.23"
    ],
    "data_stream.type": [
      "logs"
    ],
    "appName.keyword": [
      "poohapi-2"
    ],
    "@timestamp": [
      "2022-10-31T18:15:01.248Z"
    ],
    "appData.request.callType": [
      "OPT_umRx_PriorAuth_ClinicalAppeals"
    ],
    "appData.timings.duration": [
      37
    ],
    "appData.timings.unit.keyword": [
      "ms"
    ],
    "appData.request.unit": [
      "OPT"
    ],
    "data_stream.dataset": [
      "kggd-prod"
    ],
    "appData.request.callType.keyword": [
      "OPT_umRx_PriorAuth_ClinicalAppeals"
    ],
    "appData.request.hoursOfOperation": [
      false
    ],
    "namespace": [
      "cl2-pooh-api"
    ],
    "appData.request.function": [
      "PriorAuth"
    ],
    "portalEnvironment.keyword": [
      "prod"
    ],
    "appData.status": [
      "200 OK"
    ],
    "dc": [
      "pkr"
    ]
  },
  "highlight": {
    "appName.keyword": [
      "@kibana-highlighted-field@poohapi-2@/kibana-highlighted-field@"
    ],
    "data_stream.dataset.keyword": [
      "@kibana-highlighted-field@kggd-prod@/kibana-highlighted-field@"
    ]
  },
  "sort": [
    88485
  ]
}

Kindly please help. Thanks in advance.


Cool. Now could you please post a screenshot of how this query's visualization looks in Kibana?

count(eval(STATUS<300 or STATUS>=400)) as a count variable,
count(eval(STATUS=200 AND PROCESS_TIME<=100)) as a success variable,
count(eval(STATUS>=400 OR (STATUS=200 AND PROCESS_TIME>100))) as a failures variable,

Hi @yosiasz ,

This is a Splunk query; it does not work in Kibana. I want to translate it to Grafana.

Oops, yeah, I meant Splunk. Can you show us the result in Splunk?

Because I want to see how you are converting the HTTP responses into numeric values: "200 OK" to 200 in the appData_status field.


Hi @yosiasz, the first screenshot is of the query visualization, followed by the dashboard itself.


That still does not answer the above question. The data you provided has a text field for the HTTP response: appData_status = "200 OK".

How are you converting that to the numeric 200?

Hi @yosiasz ,

I'm not converting it to numeric, as I don't know how. I just specified it as
appData_status:200. Do I need to specify it as appData_status:"200 OK"?

Thanks in advance,
Ahmed

:eyes:

The ES data you provided is as follows:

    "portalEnvironment.keyword": [
      "prod"
    ],
    "appData.status": [
      "200 OK"
    ],

The Splunk query you show uses

STATUS=200

That does not jibe. Anyway, let's see how others can help to convert this "200 OK" string to just 200.

Also, how many distinct appData.status values could you have?
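
One possibility that comes to mind, untested and assuming the value is always of the form "<code> <text>": a search-time runtime field (Elasticsearch 7.11+) can expose the numeric part without reindexing, and a terms aggregation on it would also answer the distinct-values question. The index name below is a placeholder; appData.status.keyword is taken from your sample document.

POST /logs-kggd-prod-default/_search
{
  "size": 0,
  "runtime_mappings": {
    "status_code": {
      "type": "long",
      "script": {
        "source": "emit(Long.parseLong(doc['appData.status.keyword'].value.splitOnToken(' ')[0]))"
      }
    }
  },
  "aggs": {
    "by_status_code": {
      "terms": { "field": "status_code" }
    }
  }
}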

Hi @yosiasz ,

Splunk is extracting the STATUS and PROCESS_TIME fields as key-value pairs, as can be seen in the screenshot below of the query's visualization in Splunk.

Status has 3 values: 200, 300, 400

Kindly please help… Thanks in advance… Ahmed


Hi @yosiasz ,

Do you need any more inputs? Please help me.

Thanks in advance,