Hello @penguinfr and welcome!
Sorry for the long post. It was a challenge to prepare a working “template” as an example, but I will try to do so below. My concept is based on the following:
- Export CSV data for the time period chosen from within Grafana itself
- Get the data directly from the datasource through its API (remember, my example is based on InfluxDB as the datasource, which comes with its own API out of the box).
Tools I used and tested with:
- Grafana 8.4.3
- Plugins: InfluxDB datasource (native) & HTML Graphics v2.0.2 (install v2.0.2 manually to work with Grafana 8.3.4 and above)
- InfluxDB 2.1.1 (with API v2 and Flux scripting)
NOTE: This has been put together as an experiment and needs a LOT of improvement. I used examples from all around in order to create something basic within Grafana to support this idea.
If your database of choice doesn’t come with an API, you could always write your own (e.g. with Node.js) or query it through a different DB (e.g. the way you can query MySQL data directly from within InfluxDB).
Alright, with all of that out of the way, let’s get started:
In my example above I’ve created two panels: one shows the graphs the user can manipulate with the Grafana time picker. Once the desired time range has been selected, clicking the export button in the other panel queries the desired data separately and automatically downloads it to your downloads folder.
(The reason to query the data again separately, instead of reusing the graph panel as a datasource, is to give the user the flexibility to get raw data. This is especially useful for large datasets, because Grafana will only “import” data up to a certain limit, forcing an aggregation to group the data into smaller chunks.)
- Create an empty panel and load any InfluxDB datasource along with HTML Graphics as the visualization
- Enter the following Flux script in the datasource under the Query tab. It re-runs with each time-range change and produces the start and stop Unix timestamps, which we need in order to manually query the API for the raw data through JavaScript.
Flux script:
from(bucket: "Winservices_v1")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "win_disk")
  |> last()
  |> map(fn: (r) => {
    return {
      start: v.timeRangeStart,
      stop: v.timeRangeStop
    }
  })
Note: I’m just using a dummy bucket here to trick InfluxDB into letting us use the map() function, with which I can create my own custom fields.
If you have a different database for this, you could use that one as well, as long as it supports a way to get the start and stop timestamps from the Grafana time picker (e.g. perhaps MySQL and so on).
- Go to the properties panel on the right in edit mode to configure the rest of the HTML Graphics visualization.
Fill in the following code windows and leave all others empty:
– HTML/SVG document:
<div id="mySubmitButtons">
<button title="" id="myButton">
<div id="myButtonIcon">💾</div>
<span class="myButtonText">Export to .CSV</span>
</button>
<span id='status'></span>
</div>
– onRender:
var myButton = htmlNode.getElementById("myButton");
// The two fields from the Flux script hold the start/stop timestamps in
// milliseconds; divide by 1000 to get the Unix seconds the API expects.
var myStartTime = Math.floor(data.series[0].fields[0].values.buffer[0] / 1000);
var myStopTime = Math.floor(data.series[0].fields[1].values.buffer[0] / 1000);
var queryBody = `
from(bucket: "Winservices_v1")
  |> range(start: ${myStartTime}, stop: ${myStopTime})
  |> filter(fn: (r) => r["_measurement"] == "win_disk")
  |> filter(fn: (r) =>
    r["_field"] == "Percent_Disk_Read_Time" or
    r["_field"] == "Percent_Disk_Time" or
    r["_field"] == "Percent_Disk_Write_Time"
  )
  |> filter(fn: (r) => r["instance"] == "C:")
  |> keep(columns: ["_time", "_field", "_value"])
  |> aggregateWindow(every: 1m, fn: mean)
  |> filter(fn: (r) => exists r["_value"])
  |> pivot(
    rowKey: ["_time"],
    columnKey: ["_field"],
    valueColumn: "_value"
  )
  |> drop(columns: ["_start", "_stop"])
  // Optional: shift timestamps and format values as percentages
  // (requires adding: import "experimental")
  // |> map(fn: (r) => {
  //   return {
  //     _time: experimental.addDuration(d: 1h, to: r._time),
  //     "Percent_Disk_Read_Time": string(v: r["Percent_Disk_Read_Time"]) + " %",
  //     "Percent_Disk_Time": string(v: r["Percent_Disk_Time"]) + " %",
  //     "Percent_Disk_Write_Time": string(v: r["Percent_Disk_Write_Time"]) + " %"
  //   }
  // })
`;
if (myButton) {
myButton.setAttribute("onclick", `
//console.log('Button click detected');
function saveBlob(blob, fileName, rawData) {
  try {
    // Strip the metadata columns that InfluxDB prepends to its CSV output
    // (note: this only matches rows belonging to table number 0).
    var rawData_cleanedExcessData = rawData.replaceAll(',_result,0,', '');
    var rawData_cleanedExcessHeaders = rawData_cleanedExcessData.replaceAll(',result,table,', '');
    // Trigger the download through a hidden anchor element; the BOM
    // (\uFEFF) makes spreadsheet apps detect the file as UTF-8.
    var hiddenElement = document.createElement('a');
    hiddenElement.href = 'data:text/csv;charset=UTF-8,' + '\uFEFF' + encodeURIComponent(rawData_cleanedExcessHeaders);
    hiddenElement.target = '_blank';
    hiddenElement.download = fileName;
    hiddenElement.click();
    Button_status.innerText = "Download Complete!";
  } catch (error) {
    Button_status.innerText = "CSV Export Error!";
    console.log(error);
  }
}
this.disabled = true;
var exportButton = this;
var Button_status = this.parentElement.children[1];
var iconButton = this.children[0];
Button_status.innerText = "requesting data...";
var xhttp = new XMLHttpRequest();
xhttp.open("POST", "http://192.168.1.2:8086/api/v2/query?org=MyTestOrganization", true);
xhttp.setRequestHeader("Authorization","Token W1UqlTdl80zMyE7jVMd71h1y6wLrlQMcZ03W4VtH1JfUYkUWPIa8rGP2cm0RB25Jfe7rqEOg6tWEY1t8ghlQfk==");
xhttp.setRequestHeader("Accept","application/csv");
xhttp.setRequestHeader("Content-type","application/vnd.flux");
var myDate = new Date();
var fileName = "Graphs_" + myDate.toISOString() + ".csv";
xhttp.onload = function () {
if (xhttp.readyState == 4 && xhttp.status == 200) {
console.log("Data has been downloaded...");
var blob = new Blob([xhttp.responseText], {type:'text/csv'});
Button_status.innerText = "Exporting to CSV...";
saveBlob(blob, fileName, xhttp.responseText);
iconButton.innerText = "✔";
setTimeout(function(){
iconButton.innerText = '💾';
exportButton.disabled = false;
Button_status.innerText = "";
}, 1000);
} else {
//console.error("myError:",xhttp.responseText);
iconButton.innerText = "❌";
iconButton.style.color = "red";
Button_status.innerText = "Error in download process";
}
}
xhttp.onerror = function () {
alert("Something went wrong!");
iconButton.innerText = "❌";
iconButton.style.color = "red";
console.log(xhttp);
}
iconButton.innerText = "⏳";
xhttp.send(\`${queryBody}\`);
`
)
}
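The two replaceAll() calls in saveBlob above only match rows whose table number is 0. As a possible improvement (not part of the original setup), the same cleanup can be done positionally by dropping the first three columns (the empty annotation column, "result", and "table") from every row, regardless of the table number. A sketch, assuming no data values contain embedded commas:

```javascript
// Drop the first three metadata columns from every row of InfluxDB's
// CSV output, whatever the table number. Naive split: assumes no
// quoted commas inside values.
function stripInfluxMeta(csvText) {
  return csvText
    .split("\n")
    .map(line => line.split(",").slice(3).join(","))
    .join("\n");
}
```

You could call this inside saveBlob instead of the two replaceAll() lines.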
– CSS:
#myButton {
display: inline-flex;
-webkit-box-align: center;
align-items: center;
font-size: 18px;
font-weight: 500;
font-family: Roboto, Helvetica, Arial, sans-serif;
padding: 0px 20px;
height: 48px;
line-height: 46px;
letter-spacing: 0.01071em;
vertical-align: middle;
cursor: pointer;
border-radius: 2px;
background: grey;
color: rgb(255, 255, 255);
border: 1px solid transparent;
transition: background-color 350ms;
}
#myButton:hover {
background: yellow;
color: black;
}
#myButton:active {
background-color: lightyellow;
transform: translateY(2px);
}
#myButtonIcon {
display: inline-block;
vertical-align: middle;
fill: currentcolor;
margin-right: 8px;
margin-left: -12px;
overflow: hidden;
}
.myButtonText {
display: flex;
flex-direction: row;
-webkit-box-align: center;
align-items: center;
white-space: nowrap;
height: 100%;
}
- As for the graph panel, I used a standard Time series visualization with an InfluxDB datasource (make sure this datasource matches the query in the JavaScript code).
The flux script:
from(bucket: "Winservices_v1")
  |> range(start: v.timeRangeStart, stop: v.timeRangeStop)
  |> filter(fn: (r) => r["_measurement"] == "win_disk")
  |> filter(fn: (r) => r["_field"] == "Percent_Disk_Read_Time" or r["_field"] == "Percent_Disk_Time" or r["_field"] == "Percent_Disk_Write_Time")
  |> filter(fn: (r) => r["instance"] == "C:")
  |> aggregateWindow(every: v.windowPeriod, fn: last, createEmpty: false)
  |> yield(name: "last")
- That’s it! Tweak this however you want.
@penguinfr, as for your second request, I wouldn’t know how to automate this every day/week/month from within Grafana itself. I guess you could port this same JavaScript to a Node.js application or something similar in order to run it as a server or service (much like your GitHub link reference).