I’m developing a custom datasource plugin and want to register custom transformations that can also be used with data from other datasources such as Prometheus, Elasticsearch, etc. We want these transformations to be available even if there is no query to our datasource plugin on the dashboard.
@academo , I’ve played with that and was able to bundle a backend datasource plugin inside an application plugin. However, on the Grafana developers portal I found the following information (Work with nested plugins | Grafana Plugin Tools):
Grafana app plugins can nest frontend data sources together with panel plugins so that you can provide a complete user experience.
Is a backend datasource plugin bundled inside an application plugin an expected and valid setup from Grafana’s point of view? If yes, does it make sense to update the documentation?
Hi @academo,
The documentation I linked above currently states that an application plugin can nest frontend data sources and panel plugins.
I was able to bundle a backend datasource plugin inside an application plugin. If that’s OK from Grafana’s point of view, I would suggest at least changing “frontend data sources” to “data sources”.
How were you able to bundle the backend as well? Here’s what I tried that didn’t work:
1 - build the binary, name it what the datasource plugin expects it to be named, and copy it into the dist directory, either at the top level or next to the module.js of the nested datasource plugin (neither place worked)
2 - replace the “pkg” folder of my app plugin with the pkg folder of my datasource plugin, then build and run normally
Never mind, I just didn’t have the proper permissions on the binary.
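(Concretely, the fix was just adding the execute bit to the binary; the path below is from my layout, adjust for yours. The first two lines only create a stand-in file so the snippet runs on its own.)

```shell
# Stand-in for the built binary so this snippet is self-contained;
# in a real build the file is already in place, just without the execute bit.
mkdir -p dist/datasources/my-datasource-plugin
touch dist/datasources/my-datasource-plugin/gpx_my_datasource_plugin_linux_amd64

# The actual fix: Grafana won't launch a backend binary it can't execute.
chmod +x dist/datasources/my-datasource-plugin/gpx_my_datasource_plugin_linux_amd64
```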
For those wondering, the binary for the bundled datasource plugin should be next to the plugin.json file when built
(so in my case the directory structure looks like this:
dist/
├─ plugin.json (for the app plugin)
└─ datasources/
   └─ my-datasource-plugin/
      ├─ plugin.json (with “gpx_my_datasource_plugin” as the value for “executable”)
      └─ gpx_my_datasource_plugin_linux_amd64 (the executable file built by running “mage -v build:linux” in my original datasource plugin)
)
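In case it helps, here is roughly what the nested plugin.json looks like on my end. The id and name are placeholders from my setup; “backend” and “executable” are the fields that tell Grafana this plugin has a backend binary and what it is called:

```json
{
  "type": "datasource",
  "id": "myorg-mydatasource-datasource",
  "name": "My Datasource",
  "backend": true,
  "executable": "gpx_my_datasource_plugin"
}
```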
I’m still not sure how we’re supposed to set up the build process, though. Should I just add a script that moves the binary into the proper folder?
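For now a small copy step in the build pipeline seems like the simplest option. A rough sketch of what I mean (all paths and the binary name are assumptions from my layout; the first two commands only create stand-ins for the real mage output so the sketch runs on its own):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Stand-ins for the real build output so this sketch is self-contained;
# in a real pipeline this binary is produced by `mage -v build:linux`
# inside the datasource plugin's own directory.
mkdir -p my-datasource-plugin/dist
touch my-datasource-plugin/dist/gpx_my_datasource_plugin_linux_amd64

# Copy the backend binary next to the nested plugin.json inside the app
# plugin's dist tree, and make sure it ends up with the execute bit.
DEST="dist/datasources/my-datasource-plugin"
mkdir -p "$DEST"
cp my-datasource-plugin/dist/gpx_my_datasource_plugin_* "$DEST"/
chmod +x "$DEST"/gpx_my_datasource_plugin_linux_amd64
```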