No date field named @timestamp found - elasticsearch

I’m new to Grafana and I’ve run into a problem while trying to create my first Elasticsearch data source. This is the issue:

I’ve looked it up, and it seems most people hit this by creating faulty indexes or having mismatched versions while upgrading, but I created my index via Filebeat, which simply ships my local Ubuntu logs. I can see the data and the index in Kibana and everything looks fine to me. If anyone can spot something I’m missing, I would be grateful.

Here are some of my indices that should be insightful:

I would post a picture of the properties for the index too, but I’m not allowed to as a new user. I can see, though, that @timestamp is a field in my index:

    "properties" : {
      "@timestamp" : {
        "type" : "date"
      }
    }
Hopefully, someone can help. If you need more info please let me know. Thanks.

Looks like you’re using Elasticsearch v7, and support for that was added in Grafana v6.2, see the release notes. You should also configure the data source to use version 7.0+ in Grafana.
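For reference, the same version setting can be made via datasource provisioning. This is only a sketch — the URL, index name, and file location are placeholders, and `esVersion: 70` is the numeric value Grafana 6.x/7.x used for "7.0+":

```yaml
# e.g. /etc/grafana/provisioning/datasources/elasticsearch.yaml (path is an assumption)
apiVersion: 1
datasources:
  - name: Elasticsearch
    type: elasticsearch
    access: proxy
    url: http://localhost:9200
    database: "filebeat-*"       # placeholder index pattern
    jsonData:
      esVersion: 70              # "7.0+" in the datasource settings UI
      timeField: "@timestamp"
```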


Thanks, this worked well.

Hey, I was not able to fix this issue. Can you share the steps you took to resolve it?

You need to provide more info about your problem :wink:

Step 1: I created the Metricbeat service.
Step 2: I installed Elasticsearch.
Step 3: I created the dashboard, but data is not coming into it…

I am using this link, and the output shows:

Caused by: org.apache.lucene.queryparser.classic.ParseException: Cannot parse 'metricset.module:system AND AND beat.hostname:': Encountered "" at line 1, column 67.

According to the query parser, it is invalid. What is your complete query?

metricset.module:system AND AND beat.hostname:$Host

This is the one I am using…

And does the $Host variable have a value? According to the error, it seems to be empty.
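To make the failure mode concrete, here is a minimal sketch (the `build_query` helper is hypothetical, and it also drops the doubled `AND` from the original query) of why an empty variable produces that parse error — the query ends in a dangling `beat.hostname:` with nothing after the colon:

```python
def build_query(host: str) -> str:
    """Build the Lucene query string, refusing an empty host value
    so we never emit a trailing 'beat.hostname:' with no term."""
    if not host:
        raise ValueError("Host variable is empty - check the dashboard variable")
    return f"metricset.module:system AND beat.hostname:{host}"

print(build_query("web-01"))
# metricset.module:system AND beat.hostname:web-01
```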

That seems to be where I’m facing the problem… I thought the yml would pick up the hostname as $Host, but now I’m not sure where to configure it, and if I just replace it with localhost it doesn’t work either.

I found out that the Host variable name was defined incorrectly and changed it to the correct variable… but I’m still not getting data in the dashboard, even though I can see the hostname coming from Elasticsearch…

I am facing the same issue.
I am using Grafana 7.0.0 and Elasticsearch 7.4 (on AWS).


I had the same issue. I was using the Elasticsearch monitoring dashboard in Grafana, and I had set my datasource up to use @timestamp as the timeField.

However, @timestamp isn’t a stored field in the monitoring mapping, so you can’t query by it in Elasticsearch. Instead, change it to timestamp, which does get stored and indexed. Everything worked for me after that.
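A quick way to sanity-check this is to look at the index mapping (in practice, fetch it with `GET /<index>/_mapping`; the mapping below is a made-up sample) and verify that the field you configured as the timeField actually exists with type `date`:

```python
# Sample mapping (an assumption, standing in for a real _mapping response)
mapping = {
    "properties": {
        "timestamp": {"type": "date"},
        "cluster_uuid": {"type": "keyword"},
    }
}

def has_date_field(mapping: dict, field: str) -> bool:
    """Return True if `field` is present in the mapping as a date field."""
    prop = mapping.get("properties", {}).get(field)
    return prop is not None and prop.get("type") == "date"

print(has_date_field(mapping, "@timestamp"))  # False - datasource misconfigured
print(has_date_field(mapping, "timestamp"))   # True - use this as the timeField
```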

Hi @mefraimsson ,
I am facing the same problem with my Grafana. I am using version 7.0+ in Grafana, and there is a @timestamp field in my index.
Here the part of the config file:

filter {
  csv {
    separator => ";"
    columns => ["date", "time", "tenant", "message", "responsetime"]
    skip_header => "false"
    skip_empty_columns => true
  }
  mutate {
    add_field => {
      "timestamp" => "%{date},%{time}"
    }
  }
  date {
    match => [ "timestamp", "dd.MM.yyyy','HH:mm" ]
    target => "@timestamp"
  }
}
and a log from Kibana:

A little bit late, but I was facing the same problem: my index pattern named exceptions-* had to be changed to [exceptions-]*.
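The brackets matter because, in Grafana’s Elasticsearch index name patterns, the bracketed part is taken literally while the rest is expanded (e.g. per day for a "Daily" pattern). A rough sketch of that expansion, with a hypothetical `expand_daily` helper and a simplified `YYYY.MM.DD` replacement:

```python
from datetime import date

def expand_daily(pattern: str, day: date) -> str:
    """Expand a Grafana-style daily index pattern such as
    '[exceptions-]YYYY.MM.DD'; the bracketed prefix is literal."""
    if pattern.startswith("[") and "]" in pattern:
        literal, rest = pattern[1:].split("]", 1)
    else:
        literal, rest = "", pattern
    return literal + (rest.replace("YYYY", f"{day.year:04d}")
                          .replace("MM", f"{day.month:02d}")
                          .replace("DD", f"{day.day:02d}"))

print(expand_daily("[exceptions-]YYYY.MM.DD", date(2020, 5, 20)))
# exceptions-2020.05.20
```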
