ElasticSearch IndexPattern cannot set timezone

Our ES indices are generated daily at 00:00 local time (Beijing, UTC+8), but the ES datasource plugin uses UTC to build the index list.

If the queried data falls between 00:00 and 08:00 local time (Beijing, UTC+8) today, the plugin queries yesterday's index, because it builds the index list in UTC.

Could you please add a timezone selector to the ES datasource settings?

export class IndexPattern {
  private dateLocale = 'en';

  constructor(private pattern: any, private interval?: string) {}

  getIndexForToday() {
    if (this.interval) {
      return toUtc() // <-- UTC is hard-coded here
        .locale(this.dateLocale)
        .format(this.pattern);
    } else {
      return this.pattern;
    }
  }

  getIndexList(from: any, to: any) {
    if (!this.interval) {
      return this.pattern;
    }

    const intervalInfo = intervalMap[this.interval];
    const start = dateTime(from)
      .utc() // <-- UTC is hard-coded here
      .startOf(intervalInfo.startOf);
    const endEpoch = dateTime(to)
      .utc() // <-- UTC is hard-coded here
      .startOf(intervalInfo.startOf)
      .valueOf();
    const indexList = [];

    while (start.valueOf() <= endEpoch) {
      indexList.push(start.locale(this.dateLocale).format(this.pattern));
      start.add(1, intervalInfo.amount);
    }

    return indexList;
  }
}
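
For illustration, here is a minimal sketch of what a timezone-aware index list could look like. This is not the plugin's actual code: it assumes moment is available (the plugin's dateTime/toUtc helpers wrap it), and the function name getDailyIndexList and the tzOffsetMinutes parameter are hypothetical.

import moment from 'moment';

// Sketch: build a daily index list for a time range using a configurable UTC offset
// instead of hard-coded UTC. tzOffsetMinutes = 480 corresponds to Beijing (UTC+8).
function getDailyIndexList(from: number, to: number, pattern: string, tzOffsetMinutes = 0): string[] {
  const start = moment(from).utcOffset(tzOffsetMinutes).startOf('day');
  const end = moment(to).utcOffset(tzOffsetMinutes).startOf('day').valueOf();
  const indexList: string[] = [];

  while (start.valueOf() <= end) {
    indexList.push(start.format(pattern));
    start.add(1, 'day');
  }

  return indexList;
}

// Example: with pattern '[my-es-index-]YYYY-MM-DD' and tzOffsetMinutes = 480,
// a range that is still "yesterday" in UTC but already "today" in Beijing time
// resolves to today's index name.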

You need to change your ES index data ingestion to UTC as well; then the whole system will follow UTC and Grafana will display times in your browser's local time.
If you use Logstash, just add this to your date filter:

timezone => "Asia/Shanghai"

Thanks for the reply, but the ES index generation is handled by another department, so it is hard to make any changes there. If the ES plugin supported timezone selection in the index pattern, that would be an economical solution.

For your information, the timestamp is the crucial piece of information in a time-series database, so using standard UTC is the best practice for getting correct results; even the OS keeps its date and time in UTC and presents it in local time. If the whole ecosystem has to change for one small incorrect implementation, it becomes chaotic for other parts of the world.
BTW, it’s not a Grafana problem, it’s how the Elasticsearch data ingestion is done…

That’s my 5 cents…

Regards,
Fadjar340

Example:

UTC 2020-11-25 16:00:00
“my-es-index-2020-11-26” is generated, because it is already 2020-11-26 00:00:00 Beijing time, so all data after UTC 2020-11-25 16:00:00 is written into “my-es-index-2020-11-26”.

UTC 2020-11-25 18:10:15
You run an ES query in Grafana for “last 1 hour”. The Grafana ES plugin builds the index list according to UTC, which yields “my-es-index-2020-11-25”. But all data after UTC 2020-11-25 16:00:00 was written into “my-es-index-2020-11-26”, so you get nothing.
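
As a quick check of the example above (a sketch only, assuming moment is available; the pattern '[my-es-index-]YYYY-MM-DD' is just an illustration of the index naming used here):

import moment from 'moment';

const queryTime = moment.utc('2020-11-25T18:10:15Z');
const pattern = '[my-es-index-]YYYY-MM-DD';

// Index name the plugin derives (hard-coded UTC):
console.log(queryTime.format(pattern));                        // my-es-index-2020-11-25

// Index the data was actually written to (rolled over at Beijing midnight, UTC+8):
console.log(queryTime.clone().utcOffset(480).format(pattern)); // my-es-index-2020-11-26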

The problem is that the Grafana ES plugin hard-codes UTC in the following file:
grafana/public/app/plugins/datasource/elasticsearch/index_pattern.ts

export class IndexPattern {
  private dateLocale = 'en';

  constructor(private pattern: any, private interval?: string) {}

  getIndexForToday() {
    if (this.interval) {
      return toUtc() // <-- UTC is hard-coded here
        .locale(this.dateLocale)
        .format(this.pattern);
    } else {
      return this.pattern;
    }
  }

  getIndexList(from: any, to: any) {
    if (!this.interval) {
      return this.pattern;
    }

    const intervalInfo = intervalMap[this.interval];
    const start = dateTime(from)
      .utc() // <-- UTC is hard-coded here
      .startOf(intervalInfo.startOf);
    const endEpoch = dateTime(to)
      .utc() // <-- UTC is hard-coded here
      .startOf(intervalInfo.startOf)
      .valueOf();
    const indexList = [];

    while (start.valueOf() <= endEpoch) {
      indexList.push(start.locale(this.dateLocale).format(this.pattern));
      start.add(1, intervalInfo.amount);
    }

    return indexList;
  }
}

I have more than 5 ES clusters and more than 10 Grafana instances, and I don’t have any problem with the timestamps, even though my timezone is +7, because I’ve implemented timezones properly, especially in Logstash, so the timestamp values are correct. I also have clients in different time zones, and they all see the same data in their local time.

If you look at network devices, they all use UTC to save their logs, and the same goes for SNMP data, so ES gets the data as it is, following best practice.

Grafana also has options to change the displayed timezone to your choice.
I suggest talking to the developers of the ES ingestion to configure it properly according to the ES guidelines.
Otherwise, you will just get messy results… :slight_smile:

Fadjar340