I have some JSON log lines that I am trying to parse with pipeline stages. I have defined expressions and created a new json stage for each nested object/array, but the logs I am parsing do not contain a fixed number of objects in the array: it could be 2, it could be 5.
Here is some example JSON:
{
  "level": "info",
  "msg": "This is the message",
  "stats": {
    "bytes": 12345,
    "checks": 1234,
    "transferring": [
      {
        "eta": 1234,
        "name": "path/to/file/1"
      },
      {
        "eta": 1234,
        "name": "path/to/file/2"
      },
      {
        "eta": 1234,
        "name": "path/to/file/3"
      },
      {
        "eta": 1234,
        "name": "path/to/file/4"
      },
      {
        "eta": 1234,
        "name": "path/to/file/5"
      }
    ],
    "transfers": 28
  },
  "time": "2022-07-28T13:55:47.251259+01:00"
}
Here is my pipeline_stages configuration:
pipeline_stages:
  - json:
      expressions:
        level: level
        msg: msg
        timestamp: time
        stats:
  - json:
      expressions:
        bytes: bytes
        checks: checks
        transferring:
      source: stats
  - json:
      expressions:
        transferring_eta: eta
        transferring_name: name
      source: transferring
  - labels:
      level:
      msg:
      timestamp:
      stats:
      bytes:
      checks:
      transferring:
      transferring_eta:
      transferring_name:
The “bytes” and “checks” key-value pairs are created, since they are nested directly inside the “stats” object, but I cannot create the “transferring_eta” or “transferring_name” key-value pairs, presumably because “transferring” is an array of objects rather than a single object.
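The json stage's expressions are JMESPath, so I can pull out one element by hard-coding its index. A minimal sketch (reusing the source and extracted-map names from the config above) that would extract only the first element of the array:

  - json:
      expressions:
        transferring_eta: '[0].eta'
        transferring_name: '[0].name'
      source: transferring

But this only works for a known, fixed index; it does not generalize to an array that may hold 2 elements on one log line and 5 on the next, which is the case I need to handle.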