Organizing logs by runs: data ingestion pipeline

Hi, my company is building a data transformation pipeline that runs once a day in multiple environments (multiple clients, each with dev/staging/prod).

The logs need to be organized by run, so that one execution of the process produces one set of logs that can be browsed on its own.
Furthermore, those individual runs need to be grouped by environment so we know which client a run originated from.

After looking around a bit, I get the impression that Grafana is better suited to monitoring continuously running applications such as web servers and operating systems. Is my use case a poor fit for Grafana?

How do I separate the logs by client, environment, and run? Is there a recommended way to visualize this?
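To make the question concrete, here is a rough sketch of what I currently imagine (the client name `acme` and the JSON field layout are just placeholders, not what we actually have): each execution generates a fresh `run_id` and stamps it, along with client and environment, onto every log line, so a log backend could filter on those three fields.

```python
import json
import uuid
from datetime import datetime, timezone

# Hypothetical values: in our setup these would come from deployment config.
RUN_ID = uuid.uuid4().hex  # one fresh id per pipeline execution
CLIENT = "acme"
ENV = "prod"

def log_record(level: str, msg: str) -> str:
    """Emit one log line as JSON, tagged with client, env, and run_id."""
    return json.dumps({
        "ts": datetime.now(timezone.utc).isoformat(),
        "level": level,
        "msg": msg,
        "client": CLIENT,
        "env": ENV,
        "run_id": RUN_ID,
    })

print(log_record("INFO", "run started"))
```

In Grafana I would then presumably filter on client/env and pick a single `run_id` to see one execution, but I'm not sure whether that's the intended workflow or whether there's a better-supported pattern.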