Visualization of a field in MB vs. TimeTaken, aggregated by hostname and clone name

Hi,
I am using Beats, Logstash, and Elasticsearch for log shipping (Beats), parsing (Logstash), and storage (Elasticsearch).
My sample log is as follows:
[12/05/2022#22:00:00]#{UUID}#Hostname#Clonename#RequestType(I,O)#Apimethod#document size#timetaken

I am trying to create a visualization based on the document size and the time taken.
Size is a float. TimeTaken is in milliseconds.
Data will come from multiple hosts with multiple clone names, so I am doing a terms aggregation on hostname and clone.
Can you please suggest how to create a time series visualization showing, for each document size, how much time it takes?
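The terms aggregation described above could be sketched as an Elasticsearch query body: a terms bucket per hostname, nested terms per clone, and a date histogram with average TimeTaken and size per time bucket. All field names here (`host.keyword`, `clone.keyword`, `size_mb`, `time_taken`, `@timestamp`) are assumptions about the index mapping, not taken from the thread.

```python
import json

# Hypothetical Elasticsearch aggregation body; field names are assumptions.
query = {
    "size": 0,  # we only want the aggregation buckets, not the raw hits
    "aggs": {
        "per_host": {
            "terms": {"field": "host.keyword"},
            "aggs": {
                "per_clone": {
                    "terms": {"field": "clone.keyword"},
                    "aggs": {
                        "over_time": {
                            "date_histogram": {
                                "field": "@timestamp",
                                "fixed_interval": "1m",
                            },
                            "aggs": {
                                "avg_time_taken": {"avg": {"field": "time_taken"}},
                                "avg_size_mb": {"avg": {"field": "size_mb"}},
                            },
                        }
                    },
                }
            },
        }
    },
}

print(json.dumps(query, indent=2))
```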

Welcome

So your data is #-delimited? Or is this just an example? Please provide data samples that are as close to real as possible.

@yosiasz
Thanks for the reply.
My data is #-delimited. Please find the data below, from the message field:
[12/05/2022#22:31:48:265] # [da955ab3-c809-4cf6-ad2c-df3a0209def3]# #DEBUG#test#-#-#-#69920###O(RequestType)#Add(API Method)#JVMName#JVM1#size#78787#status#S##TimeTaken#600
[12/05/2022#22:31:48:265] # [da955ab3-c899-4de6-ad2c-df3a0209def3]# #DEBUG#test#-#-#-#69920###O(RequestType)#Add(API Method)#JVMName#JVM2#size#789#status#S##TimeTaken#900
In my log, size is in bytes, so I am converting it to MB in Logstash.
TimeTaken is in milliseconds.
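As a sketch of what that parsing and conversion looks like, here is a minimal Python version: pull the bracketed timestamp out first (its internal `#` characters would otherwise collide with the field delimiter), then grab the named key/value pairs and convert the size from bytes to MB. The regex patterns are assumptions based on the sample lines above.

```python
import re

# One raw line from the "message" field (sample from above).
line = ("[12/05/2022#22:31:48:265] # [da955ab3-c809-4cf6-ad2c-df3a0209def3]# "
        "#DEBUG#test#-#-#-#69920###O(RequestType)#Add(API Method)#JVMName#JVM1"
        "#size#78787#status#S##TimeTaken#600")

# Extract the bracketed timestamp first, so the '#' characters inside it
# don't interfere with the '#'-delimited fields that follow.
timestamp = re.match(r"\[([^\]]+)\]", line).group(1)

# Grab key/value pairs of interest by name rather than by position.
size_bytes = int(re.search(r"#size#(\d+)", line).group(1))
time_taken_ms = int(re.search(r"#TimeTaken#(\d+)", line).group(1))
jvm = re.search(r"#JVMName#([^#]+)", line).group(1)

size_mb = size_bytes / (1024 * 1024)  # bytes -> MB, as done in Logstash

print(timestamp, jvm, round(size_mb, 4), time_taken_ms)
```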

So what is this?
[12/05/2022#22:31:48:265] is a date and time delimited with #, which is inconsistent with your definition?

What kind of visualization do you want to use?

@yosiasz
We want to understand how much time each request takes for the API (Add): the size of the document in MB and how long it takes to process.
A request would have arrived at my server at [12/05/2022#22:31:48:265] and completed at [12/05/2022#22:31:48:365]; the difference is captured in the TimeTaken column, so TimeTaken would have a value of 100. So I am trying to get a visualization of the document size and the corresponding time taken.
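That arithmetic can be checked directly: parsing the two timestamps and subtracting them should reproduce the TimeTaken value. The format string below is an assumption (day/month order and the `:` before milliseconds are inferred from the samples).

```python
from datetime import datetime

# Timestamp layout from the log: date and time joined by '#',
# milliseconds after the last ':'. Format string is an assumption.
FMT = "%d/%m/%Y#%H:%M:%S:%f"

start = datetime.strptime("12/05/2022#22:31:48:265", FMT)
end = datetime.strptime("12/05/2022#22:31:48:365", FMT)

time_taken_ms = int((end - start).total_seconds() * 1000)
print(time_taken_ms)  # 100, matching the TimeTaken column
```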


something like this?

using the Infinity plugin with # as the delimiter, but you will have issues with your data if you want to do anything with the date/time field

[12/05/2022#22:31:48:365]

because it already contains a #. To sort that out you might have to do some regex work, unless there is a specific Grafana plugin for this type of data. I am not even sure what kind of standard this data follows.
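One way that regex work could look, as a sketch: rewrite the leading bracketed timestamp so the `#` inside it becomes a space, after which the rest of the line can safely be split on `#`. The pattern is an assumption based on the sample lines in this thread.

```python
import re

line = ("[12/05/2022#22:31:48:265] # [da955ab3-c899-4de6-ad2c-df3a0209def3]# "
        "#DEBUG#test#-#-#-#69920###O(RequestType)#Add(API Method)#JVMName#JVM2"
        "#size#789#status#S##TimeTaken#900")

# Replace the '#' between date and time inside the leading brackets with
# a space, so the timestamp no longer conflicts with the field delimiter.
fixed = re.sub(r"^\[([^#\]]+)#([^\]]+)\]", r"[\1 \2]", line)

# The first '#'-delimited token is now the whole bracketed timestamp.
print(fixed.split("#")[0])
```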