No data from Elasticsearch?

Hello,
I started using Elasticsearch a few weeks ago, and I'm now trying to connect it to Grafana to plot some data I have.

In particular, if I query Elasticsearch using:
127.0.0.1:9200/1589301605378test41244/_search

My data looks correct:

`{"took":2,"timed_out":false,"_shards":{"total":1,"successful":1,"skipped":0,"failed":0},"hits":{"total":{"value":50,"relation":"eq"},"max_score":1.0,"hits":[{"_index":"1589301605378late41244","_type":"_doc","_id":"4uzCCXIBGKD-vei1Drzf","_score":1.0,"_source":{"seq":0.0,"latency":0.209,"tx_timestamp":1589301612062,"error":0.0,"@timestamp":"2020-05-12T18:40:12.064322"}},{"_index":"1589301605378late41244","_type":"_doc","_id":"4-zCCXIBGKD-vei1ErwI","_score":1.0,"_source":{"seq":1.0,"latency":0.222,"tx_timestamp":1589301613062,"error":0.0,"@timestamp":"2020-05-12T18:40:13.063141"}},{"_index":"1589301605378late41244","_type":"_doc","_id":"7OzCCXIBGKD-vei1Nbww","_score":1.0,"_source":{"seq":10.0,"latency":0.248,"tx_timestamp":1589301622062,"error":0.0,"@timestamp":"2020-05-12T18:40:22.063404"}},{"_index":"1589301605378late41244","_type":"_doc","_id":"5OzCCXIBGKD-vei1Fbzw","_score":1.0,"_source":{"seq":2.0,"latency":0.206,"tx_timestamp":1589301614062,"error":0.0,"@timestamp":"2020-05-12T18:40:14.063389"}},{"_index":"1589301605378late41244","_type":"_doc","_id":"5ezCCXIBGKD-vei1GbzY","_score":1.0,"_source":{"seq":3.0,"latency":0.183,"tx_timestamp":1589301615062,"error":0.0,"@timestamp":"2020-05-12T18:40:15.063308"}},{"_index":"1589301605378late41244","_type":"_doc","_id":"5uzCCXIBGKD-vei1HbzA","_score":1.0,"_source":{"seq":4.0,"latency":0.207,"tx_timestamp":1589301616062,"error":0.0,"@timestamp":"2020-05-12T18:40:16.063426"}},{"_index":"1589301605378late41244","_type":"_doc","_id":"6OzCCXIBGKD-vei1JbyQ","_score":1.0,"_source":{"seq":6.0,"latency":0.198,"tx_timestamp":1589301618062,"error":0.0,"@timestamp":"2020-05-12T18:40:18.063282"}},{"_index":"1589301605378late41244","_type":"_doc","_id":"6uzCCXIBGKD-vei1Lbxg","_score":1.0,"_source":{"seq":8.0,"latency":0.189,"tx_timestamp":1589301620062,"error":0.0,"@timestamp":"2020-05-12T18:40:20.063016"}},{"_index":"1589301605378late41244","_type":"_doc","_id":"6-zCCXIBGKD-vei1MbxJ","_score":1.0,"_source":{"seq":9.0,"latency":0.233,"tx_timestamp":1589301621062,"error":0.0,"@timestamp":"2020-05-12T18:40:21.063517"}},{"_index":"1589301605378late41244","_type":"_doc","_id":"6ezCCXIBGKD-vei1Kbx3","_score":1.0,"_source":{"seq":7.0,"latency":0.12,"tx_timestamp":1589301619062,"error":0.0,"@timestamp":"2020-05-12T18:40:19.062781"}}]}}`
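For reference, the same _search call can be reproduced from a small script. This is just a minimal sketch using the Python requests library (not something Grafana-specific), assuming Elasticsearch is reachable at http://127.0.0.1:9200:

```python
import requests

ES_URL = "http://127.0.0.1:9200"
INDEX = "1589301605378test41244"  # same index as in the URL above

# Same _search request as above, asking for up to 50 documents.
resp = requests.get(f"{ES_URL}/{INDEX}/_search", params={"size": 50})
resp.raise_for_status()

for hit in resp.json()["hits"]["hits"]:
    src = hit["_source"]
    print(src["@timestamp"], src["latency"], src["error"])
```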

Specifically, I'm interested in the latency and error fields, and I'm trying to plot them over time.

If I create a visualization in Kibana, I'm able to build, for instance, a line plot from this data.

However, I'm unable to retrieve and visualize this data with Grafana.

Since I'm using Elasticsearch 7.6.2, I set up the data source as follows:
Name: Elasticsearch
URL: http://127.0.0.1:9200
Access: Server (default)
Index name: 1589301605378test41244
Time field name: @timestamp
Version: 7.0+
Max concurrent Shard Requests: 5
Min time interval: 1s (but the result is the same even if I change this field)

When I click "Save & Test", everything is fine and I get the message:
Index OK. Time field name OK.
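Just to double-check the time field, I believe the index mapping can also be queried directly, to confirm that @timestamp is really mapped as a date field (a minimal sketch with the Python requests library, using the same index as above):

```python
import requests

ES_URL = "http://127.0.0.1:9200"
INDEX = "1589301605378test41244"  # same index as configured above

# Fetch the index mapping and look at the "@timestamp" property;
# Grafana needs a "date" field to build its time filter.
mapping = requests.get(f"{ES_URL}/{INDEX}/_mapping").json()
print(mapping[INDEX]["mappings"]["properties"]["@timestamp"])
# expected output: {'type': 'date'}
```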

However, when I try to create any visualization, no data shows up.

Here is the relevant part of the output from the Query Inspector (I cut out the central part, as it was full of buckets, all with a null "value"):

{
  "request": {
    "url": "api/datasources/proxy/2/_msearch?max_concurrent_shard_requests=5",
    "method": "POST",
    "data": "{\"search_type\":\"query_then_fetch\",\"ignore_unavailable\":true,\"index\":\"1589301605378late41244\"}\n{\"size\":0,\"query\":{\"bool\":{\"filter\":[{\"range\":{\"@timestamp\":{\"gte\":1589301372542,\"lte\":1589304972543,\"format\":\"epoch_millis\"}}},{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}}]}},\"aggs\":{\"2\":{\"date_histogram\":{\"interval\":\"5s\",\"field\":\"@timestamp\",\"min_doc_count\":0,\"extended_bounds\":{\"min\":1589301372542,\"max\":1589304972543},\"format\":\"epoch_millis\"},\"aggs\":{\"1\":{\"avg\":{\"field\":\"latency\"}}}}}}\n"
  },
  "response": {
    "took": 1,
    "responses": [
      {
        "took": 1,
        "timed_out": false,
        "_shards": {
          "total": 1,
          "successful": 1,
          "skipped": 0,
          "failed": 0
        },
        "hits": {
          "total": {
            "value": 0,
            "relation": "eq"
          },
          "max_score": null,
          "hits": []
        },
        "aggregations": {
          "2": {
            "buckets": [
              {
                "1": {
                  "value": null
                },
                "key_as_string": "1589301370000",
                "key": 1589301370000,
                "doc_count": 0
              },
              {
                "1": {
                  "value": null
                },
                "key_as_string": "1589301375000",
                "key": 1589301375000,
                "doc_count": 0
              },
              {
                "1": {
                  "value": null
                },
                "key_as_string": "1589301380000",
                "key": 1589301380000,
                "doc_count": 0
              },
              {
                "1": {
                  "value": null
                },
                "key_as_string": "1589301385000",
                "key": 1589301385000,
                "doc_count": 0
              },
--------------------------  ........... ------------------------
              {
                "1": {
                  "value": null
                },
                "key_as_string": "1589304965000",
                "key": 1589304965000,
                "doc_count": 0
              },
              {
                "1": {
                  "value": null
                },
                "key_as_string": "1589304970000",
                "key": 1589304970000,
                "doc_count": 0
              }
            ]
          }
        },
        "status": 200
      }
    ],
    "$$config": {
      "url": "api/datasources/proxy/2/_msearch?max_concurrent_shard_requests=5",
      "method": "POST",
      "data": "{\"search_type\":\"query_then_fetch\",\"ignore_unavailable\":true,\"index\":\"1589301605378late41244\"}\n{\"size\":0,\"query\":{\"bool\":{\"filter\":[{\"range\":{\"@timestamp\":{\"gte\":1589301372542,\"lte\":1589304972543,\"format\":\"epoch_millis\"}}},{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"*\"}}]}},\"aggs\":{\"2\":{\"date_histogram\":{\"interval\":\"5s\",\"field\":\"@timestamp\",\"min_doc_count\":0,\"extended_bounds\":{\"min\":1589301372542,\"max\":1589304972543},\"format\":\"epoch_millis\"},\"aggs\":{\"1\":{\"avg\":{\"field\":\"latency\"}}}}}}\n"
    }
  }
}

Unfortunately, "value" is always null and "doc_count" is always 0.

As I'm still learning Elasticsearch and Grafana, I have unfortunately not been able to debug this issue successfully.
Do you know why I’m not getting any data?
Am I doing something wrong in the configuration of Grafana?

I'm currently running Ubuntu 20.04 LTS on a 64-bit machine.

Thank you very much in advance!

Maybe you need to extend the time range in the upper right to, let's say, 6 hours and see the result.
Or look at the data in Kibana using Discover with the same time range (last 1 hour): what is the result?
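You could also replay the same range filter directly against Elasticsearch and count the documents in that window. A rough sketch with the Python requests library, using the epoch-millisecond bounds from your Query Inspector output:

```python
import requests

ES_URL = "http://127.0.0.1:9200"
INDEX = "1589301605378late41244"  # index from the Query Inspector request

# Same @timestamp range filter that Grafana sent; if this reports 0 hits,
# no documents fall inside the dashboard's time window.
query = {
    "size": 0,
    "query": {
        "range": {
            "@timestamp": {
                "gte": 1589301372542,
                "lte": 1589304972543,
                "format": "epoch_millis",
            }
        }
    },
}

resp = requests.post(f"{ES_URL}/{INDEX}/_search", json=query)
print(resp.json()["hits"]["total"])
```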


Thank you very much for your suggestion!
I just discovered that there was a time-zone mismatch in the timestamps of the data sent to Elasticsearch, so the documents ended up being placed 2 hours in the future.
Unfortunately, I did not notice this at first, as I was only exploring time ranges in the past.

If I extend the time range to include at least 2 hours in the future, everything looks fine now.
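In case it helps anyone else: the mistake was in the script generating @timestamp, which used naive local time (UTC+2 here), while Elasticsearch interprets timestamps without an offset as UTC. A minimal, hypothetical Python sketch of the difference (my actual producer code is different, but the idea is the same):

```python
from datetime import datetime, timezone

# Hypothetical illustration of the time-zone mistake:
# a naive local timestamp (no offset) is read by Elasticsearch as UTC,
# so in UTC+2 the documents appear to be 2 hours in the future.
wrong = datetime.now().isoformat()               # naive local time
right = datetime.now(timezone.utc).isoformat()   # explicit UTC

print("naive local time:", wrong)
print("explicit UTC    :", right)
```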
