Transforming timeseries to data frames

I am working through my first backend plugin and am wondering how best to coerce the data received from my API into the data frame expected by the new plugin model.

The code below works; however, I feel like I am doing a lot of type conversion to make it happen. I have access to modify the API response if needed, but perhaps there is a cleaner way to unmarshal the response into frames?

Response:

    [
      {
        "id": 546,
        "name": "test_random_number",
        "points": [
          [1621382400000,84.0],
          [1621383300000,78.0],
          [1621384200000,30.0],
          [1621385100000,11.0]
        ]
      }
    ]

I map this to the following struct:

    type Response []struct {
        Id     int64           `json:"id"`
        Name   string          `json:"name"`
        Points [][]interface{} `json:"points"`
    }

Then, after receiving the response from my api, I create the frames:

    frame := data.NewFrame("response",
        data.NewField("time", nil, []time.Time{}),
        data.NewField("values", nil, []float64{}),
    )

    for _, element := range responseObject[0].Points {
        frame.AppendRow(time.UnixMilli(int64(element[0].(float64))), element[1].(float64))
    }

    response.Frames = append(response.Frames, frame)

I don’t know how costly the loop and .AppendRow are in the pipeline, but I considered returning the two columns as arrays, like this:

    {
      "id": 546,
      "name": "test_random_number",
      "time": [1621382400000, 1621383300000, 1621384200000, 1621385100000],
      "values": [84.0, 78.0, 30.0, 11.0]
    }

Which would allow me to add the dimensions without the loop:

    frame.Fields = append(frame.Fields,
        data.NewField("time", nil, myTimeArray),
        data.NewField("values", nil, myValuesArray),
    )

:grey_question: What is the best practice here when returning timeseries data to a backend plugin? Is there a better format for my api to return that would ease the transformation to frames? Thanks for any design guidance!

The fields are essentially regular Go slices, so your example is basically the same as:

    values := []float64{}

    for _, element := range responseObject[0].Points {
        values = append(values, element[1].(float64))
    }

If you want to avoid the risk of copying the data, you should be able to create the slice using something like make([]float64, 0, len(responseObject[0].Points)) to make sure there’s enough capacity to avoid reallocation.

You could also make the slice into the length you want, with something like:

    data.NewField("values", nil, make([]float64, len(responseObject[0].Points)))

and then use field.Set(i, element[1].(float64)) inside your loop.
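Side by side, the two preallocation patterns look like this with plain slices (the indexed write in `setByIndex` is what `field.Set(i, v)` does on an SDK field; the `points` variable stands in for the unmarshalled Points payload):

```go
package main

import "fmt"

// points mimics the "points" payload after json.Unmarshal into [][]interface{}.
var points = [][]interface{}{
	{1621382400000.0, 84.0},
	{1621383300000.0, 78.0},
}

// appendWithCap grows a zero-length slice that already has enough
// capacity, so append never reallocates.
func appendWithCap(pts [][]interface{}) []float64 {
	values := make([]float64, 0, len(pts))
	for _, element := range pts {
		values = append(values, element[1].(float64))
	}
	return values
}

// setByIndex makes the slice at full length and writes by index,
// the plain-slice equivalent of field.Set(i, v).
func setByIndex(pts [][]interface{}) []float64 {
	values := make([]float64, len(pts))
	for i, element := range pts {
		values[i] = element[1].(float64)
	}
	return values
}

func main() {
	fmt.Println(appendWithCap(points), setByIndex(points))
}
```

Both allocate exactly once; which reads better is mostly a matter of taste.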

If you feel unsure about what the SDK is doing, you could also construct the slices beforehand and pass them to data.NewField.

Finally, if you are worried about performance, you should consider writing a benchmark test and comparing the different approaches.
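For a quick comparison without a full go test setup, testing.Benchmark from the standard library can run a benchmark function directly (the workload here is a stand-in for the real frame-building code, not the SDK itself):

```go
package main

import (
	"fmt"
	"testing"
)

// pts is a synthetic stand-in for the unmarshalled Points payload.
var pts = make([][]interface{}, 1000)

func init() {
	for i := range pts {
		pts[i] = []interface{}{float64(i), float64(i) * 2}
	}
}

// benchAppend measures append into a slice preallocated by capacity.
func benchAppend(b *testing.B) {
	for n := 0; n < b.N; n++ {
		values := make([]float64, 0, len(pts))
		for _, e := range pts {
			values = append(values, e[1].(float64))
		}
	}
}

// benchSet measures indexed writes into a full-length slice.
func benchSet(b *testing.B) {
	for n := 0; n < b.N; n++ {
		values := make([]float64, len(pts))
		for i, e := range pts {
			values[i] = e[1].(float64)
		}
	}
}

func main() {
	fmt.Println("append:", testing.Benchmark(benchAppend))
	fmt.Println("set:   ", testing.Benchmark(benchSet))
}
```

The same pattern works as regular Benchmark* functions in a _test.go file if you prefer go test -bench.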
