How to find unique values of a particular JSON field across all log lines in the entire log

I have a log file where each log line is a JSON object, as shown below:
{"timestamp": "2024-09-22T18:34:52.910571", "caller": "func1", "symbols": ["SP500", "AAPL"], "fields": ["Close", "Open"]}
{"timestamp": "2024-09-22T18:34:53.692384", "caller": "func1", "symbols": ["SHH", "SP500"], "fields": ["Close", "Low"]}
{"timestamp": "2024-09-22T18:34:54.237938", "caller": "func2", "symbols": ["AAPL", "AMZN", "IBM", "MSFT"], "fields": ["Close", "Open"]}
I want queries that can do the following over the log file:

1.) Get all unique values of the symbols key across the entire log file, i.e. the following output:
"AAPL", "AMZN", "IBM", "MSFT", "SHH" and "SP500"
2.) Get the unique fields associated with a specific symbol across the entire log file. For example, all unique fields associated with the ticker 'SP500' should return: "Close", "Low" and "Open"

What I have so far:
For the first query, I do this:
{job="task"}
| json
| regexp `"symbols":\s*\[(?P<symbols>.*?)\]`
| line_format "{{.symbols}}"

Then I select the table view and pick the symbols column. This gives the symbols in each log line. How do I aggregate all of this and get the unique symbols out of it?
Also, how should I approach the second query? The closest I have is the sketch below.
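For the second query, I tried reusing the same extraction behind a line filter, something like this (just a sketch; it assumes 'SP500' only ever appears inside the symbols array, which is the case in my logs):

{job="task"} |= `SP500`
| regexp `"fields":\s*\[(?P<fields>.*?)\]`
| line_format "{{.fields}}"

But this still gives me one fields list per matching line, not a deduplicated set.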
Thanks

Would any of these work?

[screenshot of suggested transformations]

Hi, thanks for replying. These transformations do not work. I can reduce the symbols rows to a single row using the Reduce transformation (Mode: "Series to rows", Calculations: "All unique values").
For example, this gives me a single row with a list of symbols like:
[ '"AAPL", "MSFT", "GOOGL"', '"SP500"' ]
Now I don't know how to break this apart to get the total count, or how to extract the individual symbols from it so that they can be exported.