Grafana Geomap with Influx2 (flux)

Hi all,
I am new to Influx2 (flux) and am gathering geo location data from home assistant. I would love to use the grafana geomap visualization like I did in the old days with Influx1.
(sorry for formatting)

In influx2 I have:

```
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["entity_id"] == "person_iphone_geocoded_location")
  |> filter(fn: (r) => r["_field"] == "Location_str")
  |> aggregateWindow(every: v.windowPeriod, fn: last, createEmpty: false)
  |> yield(name: "last")
```


and get rows like the following:

```
0	unit	Location_str	[11.03826904296875, -11.57730199700904]	2022-08-14T15:14:12.801Z	2022-09-13T15:14:12.801Z	2022-08-14T16:00:00.000Z	sensor	allanas_iphone_geocoded_location	Person iPhone Geocoded Location	HA
0	unit	Location_str	[11.94648627730124, -11.91692146898322]	2022-08-14T15:14:12.801Z	2022-09-13T15:14:12.801Z	2022-08-14T17:00:00.000Z	sensor	allanas_iphone_geocoded_location	Person iPhone Geocoded Location	HA
```

the column headers are:

```
table          LAST
_measurement   GROUP      STRING
_field         GROUP      STRING
_value         NO GROUP   STRING
_start         GROUP      DATETIME:RFC3339
_stop          GROUP      DATETIME:RFC3339
_time          NO GROUP   DATETIME:RFC3339
domain         GROUP      STRING
entity_id      GROUP      STRING
friendly_name  GROUP      STRING
source         GROUP      STRING
```

There are a few ways to do this, but it would help if you provided sample data from your InfluxDB as CSV, e.g.:

```
city,growth,latitude,longitude,population,rank,state
New York,4.8,40.7127837,-74.0059413,8405837,1,New York
Los Angeles,4.8,34.0522342,-118.2436849,3884307,2,California
Chicago,-6.1,41.8781136,-87.6297982,2718782,3,Illinois
Houston,11.0,29.7604267,-95.3698028,2195914,4,Texas
Philadelphia,2.6,39.9525839,-75.1652215,1553165,5,Pennsylvania
Phoenix,14.0,33.4483771,-112.0740373,1513367,6,Arizona
```

Thanks, here are some lines from the CSV:

```
#datatype,string,long,dateTime:RFC3339,dateTime:RFC3339,dateTime:RFC3339,string,string,string,string,string,string,string
#default,last,,,,,,,,,,,
,result,table,_start,_stop,_time,_value,_field,_measurement,domain,entity_id,friendly_name,source
,,0,2022-08-14T20:36:15.245192753Z,2022-09-13T20:36:15.245192753Z,2022-08-14T22:00:00Z,"[11.983123779296875, -11.942136096479985]",Location_str,unit,sensor,person_iphone_geocoded_location,Person iPhone Geocoded Location,HA
,,0,2022-08-14T20:36:15.245192753Z,2022-09-13T20:36:15.245192753Z,2022-08-14T23:00:00Z,"[11.97882080078125, -11.97854114743225]",Location_str,unit,sensor,person_iphone_geocoded_location,Person iPhone Geocoded Location,HA
,,0,2022-08-14T20:36:15.245192753Z,2022-09-13T20:36:15.245192753Z,2022-08-15T00:00:00Z,"[11.97882080078125, -11.97854114743225]",Location_str,unit,sensor,person_iphone_geocoded_location,Allana’s iPhone Geocoded Location,HA
```

Why do you have the geolocation saved as a string like this?

`[11.97882080078125, -11.97854114743225]`

You will need to find a way to split this ‘array’. Then try this:

```
import "strings"

owners = from(bucket: "mister")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "unit")
  |> filter(fn: (r) => r["_field"] == "friendly_name")
  |> keep(columns: ["_time", "_value", "_field"])

locations = from(bucket: "mister")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "unit")
  |> filter(fn: (r) => r["_field"] == "Location_str")
  |> keep(columns: ["_time", "_value", "_field"])

// make sure the value is a string before splitting it
geo = locations
  |> map(fn: (r) => ({r with geoloc: string(v: r._value)}))

x = geo
  |> map(fn: (r) => {
    // strip the surrounding brackets, then split on the comma
    clean = strings.replace(v: r.geoloc, t: "[", u: "", i: 1)
    final = strings.replace(v: clean, t: "]", u: "", i: 1)
    parts = strings.split(v: final, t: ",")

    // trim whitespace and convert to floats so Geomap can use the coordinates
    return {
      time: r["_time"],
      lat: float(v: strings.trimSpace(v: parts[0])),
      lon: float(v: strings.trimSpace(v: parts[1])),
    }
  })

union(tables: [owners, x])
```

There must be a better way!
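For reference, the same string-to-coordinate parsing the `map()` step performs can be sketched in Python, just to make the individual steps explicit (the `parse_location` helper is hypothetical, not part of any library):

```python
def parse_location(value: str) -> tuple[float, float]:
    """Turn a '[lat, lon]' string into a (lat, lon) pair of floats."""
    stripped = value.strip("[]")             # drop the surrounding brackets
    lat_str, lon_str = stripped.split(",")   # split on the comma
    return float(lat_str), float(lon_str)    # float() also trims whitespace

lat, lon = parse_location("[11.97882080078125, -11.97854114743225]")
print(lat, lon)  # 11.97882080078125 -11.97854114743225
```

The float conversion is the important part: Geomap expects numeric latitude/longitude fields, not strings.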


This is amazing, thank you so much! I don’t know why home_assistant decided to put the lat/long into a string array. But I am sure this will help others also.

Regarding the weird coords, that’s just me changing them from the real numbers. 🙂
Once again thanks, this appears to be working!!
