Hi! I’m developing a custom backend datasource plugin using Spring Boot.
So far I was only writing one query result to one DataFrame, and one Arrow record batch to one Arrow file, and it worked as it should. Now, when I try two queries, it doesn’t work, which is expected since I haven’t added support for multiple queries yet.
After inspecting results from the TestData DB datasource with multiple queries, it looks to me like a DataFrame is created for each query.
So if we have a DataFrame for every query, should I write one record batch per frame, or should they all go into one batch? Also, should everything be written to one Arrow file and then read, or to separate files (one per batch/frame)?
I’m a bit stuck here, and the lack of information on this topic doesn’t really help, so any help is appreciated!
The Arrow format is typically used by Grafana for internal communication. For example, the Go SDK abstracts this away for you.
The reason why you’re not finding much information is likely because there’s no official support for building backend plugins in languages other than Go.
While I don’t know enough about the encoding to give you any specific examples, generally:
Grafana makes a data request that can return multiple data responses, one for each user-defined query.
Grafana knows which data response belongs to which query using a refId property.
Each data response can in turn contain multiple data frames.
For example, if the user defines two queries, A and B, that would result in two data responses. Each data response could in turn contain several data frames, one for each time series returned by the query.
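To make the shape concrete, here is a minimal Java sketch of that structure, since the plugin is written in Spring Boot. All class and field names here are hypothetical — they only mirror the concepts from the Go SDK (a query-data response mapping each refId to a data response, each holding multiple frames), not any real Java API:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class QueryDataSketch {

    // One frame per time series returned by a query (hypothetical stand-in
    // for a real Arrow-backed DataFrame).
    static class DataFrame {
        final String name;
        DataFrame(String name) { this.name = name; }
    }

    // One data response per user-defined query; it may hold several frames.
    static class DataResponse {
        final List<DataFrame> frames = new ArrayList<>();
    }

    // The overall response maps each query's refId to its DataResponse,
    // which is how Grafana matches responses back to queries.
    static class QueryDataResponse {
        final Map<String, DataResponse> responses = new LinkedHashMap<>();

        DataResponse forQuery(String refId) {
            return responses.computeIfAbsent(refId, k -> new DataResponse());
        }
    }

    public static void main(String[] args) {
        QueryDataResponse qdr = new QueryDataResponse();

        // Two user-defined queries, A and B -> two data responses.
        for (String refId : List.of("A", "B")) {
            DataResponse dr = qdr.forQuery(refId);
            // Each query might return several time series -> several frames.
            dr.frames.add(new DataFrame(refId + "-series-1"));
            dr.frames.add(new DataFrame(refId + "-series-2"));
        }

        System.out.println(qdr.responses.size());            // prints 2
        System.out.println(qdr.forQuery("A").frames.size()); // prints 2
    }
}
```

The key point the sketch illustrates: there is one DataResponse per query (not one total), and the frames for each query stay grouped under that query's refId.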
Okay, I think this will solve it, or at least get me on the right track, because I was creating only one DataResponse no matter how many queries there were. I will post an update, thanks!