I’ve looked it up, and it seems most people hit this by creating faulty indexes or by having mismatched versions while upgrading, but I’ve created an index via Filebeat that simply maps my local Ubuntu logs. I can see both the data and the index in Kibana, and everything looks fine to me. If anyone can spot something I’m missing, I’d be grateful.
Here are some of my indexes that should be insightful:
Looks like you’re using Elasticsearch v7; support for that was added in Grafana v6.2, see the release notes. You should also configure the datasource in Grafana to use version 7.0+.
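In case it helps, here is a rough sketch of what that looks like as a provisioned datasource (the file path, datasource name, and index pattern are assumptions, adjust them to your setup); the same settings can be made in the datasource UI by picking Version 7.0+:

```yaml
# /etc/grafana/provisioning/datasources/elasticsearch.yaml  (example path)
apiVersion: 1

datasources:
  - name: Elasticsearch            # assumed datasource name
    type: elasticsearch
    access: proxy
    url: http://localhost:9200     # adjust to where Elasticsearch is running
    database: "filebeat-*"         # assumed index pattern for the Filebeat index
    jsonData:
      esVersion: 70                # corresponds to "7.0+" in the datasource settings
      timeField: "@timestamp"
```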
Step 1: I have created the Metricbeat service.
Step 2: I have installed Elasticsearch.
Step 3: I created the dashboard, but data is not coming into it…
Caused by: org.apache.lucene.queryparser.classic.ParseException: Cannot parse ‘metricset.module:system AND metricset.name:core AND beat.hostname:’: Encountered “” at line 1, column 67.
It seems that’s where I’m facing the problem… I thought the yml would pick up the hostname as $host, but now I’m not sure where to configure it, and if I just replace it with localhost it doesn’t work either.
I found out that the host variable name defined in the dashboard was not correct, and I changed it to the correct variable… but I’m still not getting data in the dashboard, even though I can see the hostname coming from Elasticsearch…
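For anyone else hitting the same empty beat.hostname: value, here is a minimal sketch of how the variable and the panel query usually have to line up (the variable name host and the field beat.hostname are assumptions based on the query above; the dashboard stores this as JSON, written out here as YAML-style notes):

```yaml
# Dashboard template variable (type: Query, against the Elasticsearch datasource).
# Its name must match what the panel query references.
templating:
  - name: host
    type: query
    datasource: Elasticsearch
    query: '{"find": "terms", "field": "beat.hostname"}'   # lists hostnames reported by Metricbeat

# Panel Lucene query using the variable above; if $host resolves to nothing,
# Elasticsearch receives "beat.hostname:" and throws the ParseException shown earlier:
#   metricset.module:system AND metricset.name:core AND beat.hostname:$host
```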
I had the same issue. I was using the Elasticsearch monitoring dashboard in Grafana, and I had set my datasource up to use @timestamp as the timeField.
However, @timestamp isn’t a stored field in the monitoring mapping, so you can’t query by it in Elasticsearch. Instead, change it to timestamp, which does get stored and indexed. It all worked for me after that.
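For reference, this is roughly what that change looks like if the datasource is provisioned (the datasource name and index pattern are assumptions; the Elasticsearch monitoring indices are usually named .monitoring-es-*):

```yaml
apiVersion: 1

datasources:
  - name: Elasticsearch Monitoring   # assumed datasource name
    type: elasticsearch
    access: proxy
    url: http://localhost:9200
    database: ".monitoring-es-*"     # assumed monitoring index pattern
    jsonData:
      esVersion: 70
      timeField: "timestamp"         # not "@timestamp" for these indices
```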
Hi @mefraimsson,
I am facing the same problem with my Grafana. I am using version 7.0+ in Grafana, and there is an @timestamp field in my index.
Here is the relevant part of the config file: