Convert log level from uppercase to lowercase

Hi team,

I have 2 types of log-level values: uppercase and lowercase.

I used the Promtail template stage to convert all of them to lowercase, but it did not work in my case. Here is my config:

  - match:
      selector: '{source="gitlab"}'
      stages:
        - replace:
            expression: "(severity)"
            replace: "level"
        - json:
            expressions:
              severity: severity
              timestamp: time
              message: message
              level: level
        - logfmt:
            mapping:
              timestamp: ts
              caller: caller
              level: level
              unknown:
        - labels:
            severity:
            message:
            timestamp:
            caller:
            level:
        - template:
            source: level
            template: '{{ ToLower .Value }}'

Could you have a look and tell me if anything is wrong in my configuration?

Some examples of logs would be helpful.

Your first replace is strange, and logfmt after json is also strange. Without an example log it's hard to tell what they are for. Maybe try:

json → labels → template

Something like this (not tested):

stages:
  - json:
      expressions:
        caller:
        severity: severity
        timestamp: time
        message: message
        level: level
  - labels:
      severity:
      message:
      timestamp:
      caller:
      level:
  - template:
      source: severity
      template: '{{ ToLower .Value }}'

Thanks for your support @tonyswumac,

I tried to configure it as you advised, but it still does not work in my case.
I'm trying to analyze the GitLab logs. The logs are collected from many files, which means they come in different formats and I have to use both JSON and logfmt. For log levels I also have two labels, severity and level, which is why I use replace to change severity to level. Could you take a look at this topic to get a better picture of my issue?
Below are some examples of my GitLab logs:

==> /var/log/gitlab/gitaly/gitaly_ruby_json.log <==
{"type":"gitaly-ruby","grpc.start_time":"2024-03-25T08:56:16Z","grpc.time_ms":0.163,"grpc.code":"OK","grpc.method":"Check","grpc.service":"grpc.health.v1.Health","pid":450,"correlation_id":"77b22398d0c4810aca6347d9232a2290","time":"2024-03-25T08:56:16.886Z"}

==> /var/log/gitlab/gitlab-exporter/current <==
2024-03-25_08:56:24.60848 127.0.0.1 - - [25/Mar/2024:08:56:24 UTC] "GET /ruby HTTP/1.1" 200 1017
2024-03-25_08:56:24.60853 - -> /ruby
2024-03-25_08:56:24.80958 Passing 'exists?' command to redis as is; blind passthrough has been deprecated and will be removed in redis-namespace 2.0 (at /opt/gitlab/embedded/lib/ruby/gems/2.7.0/gems/sidekiq-6.4.0/lib/sidekiq/api.rb:962:in `block (3 levels) in each')
2024-03-25_08:56:24.83991 127.0.0.1 - - [25/Mar/2024:08:56:24 UTC] "GET /sidekiq HTTP/1.1" 200 74367
2024-03-25_08:56:24.83996 - -> /sidekiq

==> /var/log/gitlab/gitlab-rails/production.log <==
Started GET "/-/metrics" for 127.0.0.1 at 2024-03-25 08:56:25 +0000
Processing by MetricsController#index as HTML
Completed 200 OK in 35ms (Views: 0.5ms | ActiveRecord: 0.0ms | Elasticsearch: 0.0ms | Allocations: 930)

==> /var/log/gitlab/gitlab-rails/production_json.log <==
{"method":"GET","path":"/-/metrics","format":"html","controller":"MetricsController","action":"index","status":200,"time":"2024-03-25T08:56:25.653Z","params":[],"redis_calls":2,"redis_duration_s":0.001154,"redis_read_bytes":406,"redis_write_bytes":200,"redis_cache_calls":2,"redis_cache_duration_s":0.001154,"redis_cache_read_bytes":406,"redis_cache_write_bytes":200,"db_count":0,"db_write_count":0,"db_cached_count":0,"db_replica_count":0,"db_primary_count":0,"db_main_count":0,"db_main_replica_count":0,"db_replica_cached_count":0,"db_primary_cached_count":0,"db_main_cached_count":0,"db_main_replica_cached_count":0,"db_replica_wal_count":0,"db_primary_wal_count":0,"db_main_wal_count":0,"db_main_replica_wal_count":0,"db_replica_wal_cached_count":0,"db_primary_wal_cached_count":0,"db_main_wal_cached_count":0,"db_main_replica_wal_cached_count":0,"db_replica_duration_s":0.0,"db_primary_duration_s":0.0,"db_main_duration_s":0.0,"db_main_replica_duration_s":0.0,"cpu_s":0.037869,"mem_objects":2262,"mem_bytes":975360,"mem_mallocs":4547,"mem_total_bytes":1065840,"pid":2755535,"worker_id":"puma_0","rate_limiting_gates":[],"correlation_id":"ca1e2903-1ee3-4c42-9ba7-757542f9098a","db_duration_s":0.0,"view_duration_s":0.00053,"duration_s":0.03523}

==> /var/log/gitlab/postgres-exporter/current <==
2024-03-25_08:56:25.77858 ts=2024-03-25T08:56:25.777Z caller=log.go:168 level=debug msg="Querying PostgreSQL version" server=/var/opt/gitlab/postgresql:5432
2024-03-25_08:56:25.77886 ts=2024-03-25T08:56:25.778Z caller=log.go:168 level=debug msg="Querying pg_setting view" server=/var/opt/gitlab/postgresql:5432
2024-03-25_08:56:25.78075 ts=2024-03-25T08:56:25.780Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_vacuum
2024-03-25_08:56:25.78208 ts=2024-03-25T08:56:25.782Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_stat_activity
2024-03-25_08:56:25.78346 ts=2024-03-25T08:56:25.783Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_stuck_idle_in_transaction
2024-03-25_08:56:25.78418 ts=2024-03-25T08:56:25.784Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_postmaster
2024-03-25_08:56:25.78466 ts=2024-03-25T08:56:25.784Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_vacuum_analyze
2024-03-25_08:56:25.78539 ts=2024-03-25T08:56:25.785Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_blocked
2024-03-25_08:56:25.78754 ts=2024-03-25T08:56:25.787Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_stat_activity_autovacuum_active
2024-03-25_08:56:25.78850 ts=2024-03-25T08:56:25.788Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_stat_database
2024-03-25_08:56:25.80040 ts=2024-03-25T08:56:25.800Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_stat_database_conflicts
2024-03-25_08:56:25.80174 ts=2024-03-25T08:56:25.801Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_replication_slots
2024-03-25_08:56:25.80218 ts=2024-03-25T08:56:25.802Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_replication
2024-03-25_08:56:25.80239 ts=2024-03-25T08:56:25.802Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_oldest_blocked
2024-03-25_08:56:25.80315 ts=2024-03-25T08:56:25.803Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_slow
2024-03-25_08:56:25.80386 ts=2024-03-25T08:56:25.803Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_stat_activity_autovacuum
2024-03-25_08:56:25.80450 ts=2024-03-25T08:56:25.804Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_total_relation_size
2024-03-25_08:56:25.86575 ts=2024-03-25T08:56:25.865Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_stat_replication
2024-03-25_08:56:25.86752 ts=2024-03-25T08:56:25.867Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_stat_bgwriter
2024-03-25_08:56:25.86882 ts=2024-03-25T08:56:25.868Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_stat_activity_marginalia_sampler
2024-03-25_08:56:25.87144 ts=2024-03-25T08:56:25.871Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_long_running_transactions
2024-03-25_08:56:25.87217 ts=2024-03-25T08:56:25.872Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_long_running_transactions_marginalia
2024-03-25_08:56:25.87324 ts=2024-03-25T08:56:25.873Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_locks
2024-03-25_08:56:25.87395 ts=2024-03-25T08:56:25.873Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_stat_archiver
2024-03-25_08:56:25.87514 ts=2024-03-25T08:56:25.875Z caller=log.go:168 level=debug msg="Querying namespace" namespace=pg_vacuum_queue

==> /var/log/gitlab/gitaly/gitaly_ruby_json.log <==
{"type":"gitaly-ruby","grpc.start_time":"2024-03-25T08:56:27Z","grpc.time_ms":0.19,"grpc.code":"OK","grpc.method":"Check","grpc.service":"grpc.health.v1.Health","pid":449,"correlation_id":"d362b156912b161a8979ca2dfda91df7","time":"2024-03-25T08:56:27.980Z"}

==> /var/log/gitlab/puma/puma_stdout.log <==
{"timestamp":"2024-03-25T08:56:28.291Z","pid":338,"message":"PumaWorkerKiller: Consuming 1979.046875 mb with master and 4 workers."}

==> /var/log/gitlab/gitlab-exporter/current <==
2024-03-25_08:56:30.98379 127.0.0.1 - - [25/Mar/2024:08:56:30 UTC] "GET /database HTTP/1.1" 200 2877
2024-03-25_08:56:30.98385 - -> /database

==> /var/log/gitlab/gitaly/gitaly_ruby_json.log <==
{"type":"gitaly-ruby","grpc.start_time":"2024-03-25T08:56:31Z","grpc.time_ms":0.176,"grpc.code":"OK","grpc.method":"Check","grpc.service":"grpc.health.v1.Health","pid":450,"correlation_id":"1801c7497f1ae882bd39c93660a742b4","time":"2024-03-25T08:56:31.888Z"}

==> /var/log/gitlab/sidekiq/current <==
{"severity":"INFO","time":"2024-03-25T08:56:32.043Z","retry":0,"queue":"cronjob:database_batched_background_migration","version":0,"queue_namespace":"cronjob","args":[],"class":"Database::BatchedBackgroundMigrationWorker","jid":"40aae68f6abfeff7fe5e3137","created_at":"2024-03-25T08:56:32.033Z","meta.caller_id":"Cronjob","correlation_id":"eae6e85d49f9e867e946da27495bdab5","meta.root_caller_id":"Cronjob","meta.feature_category":"database","worker_data_consistency":"always","idempotency_key":"resque:gitlab:duplicate:cronjob:database_batched_background_migration:592d9619e1997b640b70ce6a22f6713bc7793bb7a4e342b7380d90b691fcd6ae","enqueued_at":"2024-03-25T08:56:32.042Z","job_size_bytes":2,"pid":430,"message":"Database::BatchedBackgroundMigrationWorker JID-40aae68f6abfeff7fe5e3137: start","job_status":"start","scheduling_latency_s":0.001147}
{"severity":"INFO","time":"2024-03-25T08:56:32.098Z","retry":0,"queue":"cronjob:ci_update_locked_unknown_artifacts","version":0,"queue_namespace":"cronjob","args":[],"class":"Ci::UpdateLockedUnknownArtifactsWorker","jid":"fd9aade4daf136745129171b","created_at":"2024-03-25T08:56:32.071Z","meta.caller_id":"Cronjob","correlation_id":"2d81c0c1077b67200a82cc816a1b0109","meta.root_caller_id":"Cronjob","meta.feature_category":"build_artifacts","worker_data_consistency":"sticky","wal_locations":{},"wal_location_source":"primary","idempotency_key":"resque:gitlab:duplicate:cronjob:ci_update_locked_unknown_artifacts:912858db94fb8c9da5e144f8394730b16dbb4d7f396606379d4f42078eef6e20","size_limiter":"validated","enqueued_at":"2024-03-25T08:56:32.095Z","job_size_bytes":2,"pid":430,"message":"Ci::UpdateLockedUnknownArtifactsWorker JID-fd9aade4daf136745129171b: start","job_status":"start","scheduling_latency_s":0.003209}
{"severity":"INFO","time":"2024-03-25T08:56:32.101Z","retry":0,"queue":"cronjob:database_batched_background_migration","version":0,"queue_namespace":"cronjob","args":[],"class":"Database::BatchedBackgroundMigrationWorker","jid":"40aae68f6abfeff7fe5e3137","created_at":"2024-03-25T08:56:32.033Z","meta.caller_id":"Cronjob","correlation_id":"eae6e85d49f9e867e946da27495bdab5","meta.root_caller_id":"Cronjob","meta.feature_category":"database","worker_data_consistency":"always","idempotency_key":"resque:gitlab:duplicate:cronjob:database_batched_background_migration:592d9619e1997b640b70ce6a22f6713bc7793bb7a4e342b7380d90b691fcd6ae","enqueued_at":"2024-03-25T08:56:32.042Z","job_size_bytes":2,"pid":430,"message":"Database::BatchedBackgroundMigrationWorker JID-40aae68f6abfeff7fe5e3137: done: 0.058428 sec","job_status":"done","scheduling_latency_s":0.001147,"redis_calls":5,"redis_duration_s":0.003555,"redis_read_bytes":741,"redis_write_bytes":662,"redis_cache_calls":4,"redis_cache_duration_s":0.002321,"redis_cache_read_bytes":731,"redis_cache_write_bytes":274,"redis_queues_calls":1,"redis_queues_duration_s":0.001234,"redis_queues_read_bytes":10,"redis_queues_write_bytes":388,"db_count":1,"db_write_count":0,"db_cached_count":0,"db_replica_count":0,"db_primary_count":1,"db_main_count":1,"db_main_replica_count":0,"db_replica_cached_count":0,"db_primary_cached_count":0,"db_main_cached_count":0,"db_main_replica_cached_count":0,"db_replica_wal_count":0,"db_primary_wal_count":0,"db_main_wal_count":0,"db_main_replica_wal_count":0,"db_replica_wal_cached_count":0,"db_primary_wal_cached_count":0,"db_main_wal_cached_count":0,"db_main_replica_wal_cached_count":0,"db_replica_duration_s":0.0,"db_primary_duration_s":0.011,"db_main_duration_s":0.011,"db_main_replica_duration_s":0.0,"cpu_s":0.00747,"mem_objects":2198,"mem_bytes":308576,"mem_mallocs":583,"mem_total_bytes":396496,"worker_id":"sidekiq_0","rate_limiting_gates":[],"duration_s":0.058428,"completed_at":"2024-03-25T08:56:32.101Z","load_balancing_strategy":"primary","db_duration_s":0.01117}
{"severity":"INFO","time":"2024-03-25T08:56:32.124Z","retry":0,"queue":"cronjob:database_batched_background_migration_ci_database","version":0,"queue_namespace":"cronjob","args":[],"class":"Database::BatchedBackgroundMigration::CiDatabaseWorker","jid":"9384fced79547341d1ab1083","created_at":"2024-03-25T08:56:32.120Z","meta.caller_id":"Cronjob","correlation_id":"485548d273ccef3410d8a28fd7300122","meta.root_caller_id":"Cronjob","meta.feature_category":"database","worker_data_consistency":"always","idempotency_key":"resque:gitlab:duplicate:cronjob:database_batched_background_migration_ci_database:6ba8adee4a8c1e77d2f087a2765c43226ceffa1fd65abc34b95725a7c9abd857","enqueued_at":"2024-03-25T08:56:32.121Z","job_size_bytes":2,"pid":430,"message":"Database::BatchedBackgroundMigration::CiDatabaseWorker JID-9384fced79547341d1ab1083: start","job_status":"start","scheduling_latency_s":0.002537}
{"severity":"INFO","time":"2024-03-25T08:56:32.126Z","retry":0,"queue":"cronjob:ci_update_locked_unknown_artifacts","version":0,"queue_namespace":"cronjob","args":[],"class":"Ci::UpdateLockedUnknownArtifactsWorker","jid":"fd9aade4daf136745129171b","created_at":"2024-03-25T08:56:32.071Z","meta.caller_id":"Cronjob","correlation_id":"2d81c0c1077b67200a82cc816a1b0109","meta.root_caller_id":"Cronjob","meta.feature_category":"build_artifacts","worker_data_consistency":"sticky","wal_locations":{},"wal_location_source":"primary","idempotency_key":"resque:gitlab:duplicate:cronjob:ci_update_locked_unknown_artifacts:912858db94fb8c9da5e144f8394730b16dbb4d7f396606379d4f42078eef6e20","size_limiter":"validated","enqueued_at":"2024-03-25T08:56:32.095Z","job_size_bytes":2,"pid":430,"message":"Ci::UpdateLockedUnknownArtifactsWorker JID-fd9aade4daf136745129171b: done: 0.028077 sec","job_status":"done","scheduling_latency_s":0.003209,"redis_calls":2,"redis_duration_s":0.002103,"redis_read_bytes":213,"redis_write_bytes":446,"redis_cache_calls":1,"redis_cache_duration_s":0.000958,"redis_cache_read_bytes":203,"redis_cache_write_bytes":64,"redis_queues_calls":1,"redis_queues_duration_s":0.001145,"redis_queues_read_bytes":10,"redis_queues_write_bytes":382,"db_count":0,"db_write_count":0,"db_cached_count":0,"db_replica_count":0,"db_primary_count":0,"db_main_count":0,"db_main_replica_count":0,"db_replica_cached_count":0,"db_primary_cached_count":0,"db_main_cached_count":0,"db_main_replica_cached_count":0,"db_replica_wal_count":0,"db_primary_wal_count":0,"db_main_wal_count":0,"db_main_replica_wal_count":0,"db_replica_wal_cached_count":0,"db_primary_wal_cached_count":0,"db_main_wal_cached_count":0,"db_main_replica_wal_cached_count":0,"db_replica_duration_s":0.0,"db_primary_duration_s":0.0,"db_main_duration_s":0.0,"db_main_replica_duration_s":0.0,"cpu_s":0.003313,"mem_objects":1262,"mem_bytes":156424,"mem_mallocs":320,"mem_total_bytes":206904,"worker_id":"sidekiq_0","rate_limiting_gates":[],"duration_s":0.028077,"completed_at":"2024-03-25T08:56:32.126Z","load_balancing_strategy":"primary_no_wal","db_duration_s":0.0}
{"severity":"INFO","time":"2024-03-25T08:56:32.138Z","class":"Database::BatchedBackgroundMigration::CiDatabaseWorker","database":"ci","message":"skipping migration execution for unconfigured database","retry":0}
{"severity":"INFO","time":"2024-03-25T08:56:32.142Z","retry":0,"queue":"cronjob:database_batched_background_migration_ci_database","version":0,"queue_namespace":"cronjob","args":[],"class":"Database::BatchedBackgroundMigration::CiDatabaseWorker","jid":"9384fced79547341d1ab1083","created_at":"2024-03-25T08:56:32.120Z","meta.caller_id":"Cronjob","correlation_id":"485548d273ccef3410d8a28fd7300122","meta.root_caller_id":"Cronjob","meta.feature_category":"database","worker_data_consistency":"always","idempotency_key":"resque:gitlab:duplicate:cronjob:database_batched_background_migration_ci_database:6ba8adee4a8c1e77d2f087a2765c43226ceffa1fd65abc34b95725a7c9abd857","enqueued_at":"2024-03-25T08:56:32.121Z","job_size_bytes":2,"pid":430,"message":"Database::BatchedBackgroundMigration::CiDatabaseWorker JID-9384fced79547341d1ab1083: done: 0.018151 sec","job_status":"done","scheduling_latency_s":0.002537,"redis_calls":1,"redis_duration_s":0.001036,"redis_read_bytes":10,"redis_write_bytes":412,"redis_queues_calls":1,"redis_queues_duration_s":0.001036,"redis_queues_read_bytes":10,"redis_queues_write_bytes":412,"db_count":0,"db_write_count":0,"db_cached_count":0,"db_replica_count":0,"db_primary_count":0,"db_main_count":0,"db_main_replica_count":0,"db_replica_cached_count":0,"db_primary_cached_count":0,"db_main_cached_count":0,"db_main_replica_cached_count":0,"db_replica_wal_count":0,"db_primary_wal_count":0,"db_main_wal_count":0,"db_main_replica_wal_count":0,"db_replica_wal_cached_count":0,"db_primary_wal_cached_count":0,"db_main_wal_cached_count":0,"db_main_replica_wal_cached_count":0,"db_replica_duration_s":0.0,"db_primary_duration_s":0.0,"db_main_duration_s":0.0,"db_main_replica_duration_s":0.0,"cpu_s":0.00289,"mem_objects":1090,"mem_bytes":96088,"mem_mallocs":261,"mem_total_bytes":139688,"worker_id":"sidekiq_0","rate_limiting_gates":[],"duration_s":0.018151,"completed_at":"2024-03-25T08:56:32.142Z","load_balancing_strategy":"primary","db_duration_s":0.0}

All these logs have distinct formats, and I would recommend using a separate pipeline for each of them. For example:

scrape_configs:
  - job_name: gitlab/gitaly-ruby-json
    static_configs:
    - targets:
        - localhost
      labels:
        __path__: /var/log/gitlab/gitaly/gitaly_ruby_json.log
        job: gitlab/gitaly-ruby-json
    pipeline_stages:
      <PIPELINES>

  - job_name: gitlab-puma-stdout
    static_configs:
    - targets:
        - localhost
      labels:
        __path__: /var/log/gitlab/puma/puma_stdout.log
        job: gitlab-puma-stdout
    pipeline_stages:
      <PIPELINES>
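
As a rough, untested sketch of what one of those pipelines could look like, here is an example for the sidekiq log, which is JSON with an uppercase severity field (field names taken from your samples):

    pipeline_stages:
      - json:
          expressions:
            level: severity      # pull the JSON "severity" field into the extracted map as "level"
            timestamp: time
            message: message
      - template:
          source: level
          template: '{{ ToLower .Value }}'   # INFO -> info in the extracted map
      - labels:
          level:                 # attach the already-lowercased value as a label

Note that template runs before labels here, so the label is created from the lowercased value.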

Thanks @tonyswumac,
Our GitLab runs in a Docker container, and we don't mount the logs to a host volume. That is why I cannot configure the path for each file; in my configuration the GitLab logs are collected directly from the container.
I found another solution for this:

      - match:
          selector: '{source="gitlab"}'
          stages:
            - replace:
                expression: "(severity)"
                replace: "level"
            - json:
                expressions:
                  level: level
            - labels:
                level:         
            - replace:
                expression: "(INFO)"
                source: "level"
                replace: "info"
            - replace:
                expression: "(DEBUG)"
                source: "level"
                replace: "debug"
            - replace:
                expression: "(WARN)"
                source: "level"
                replace: "warn"

            - replace:
                expression: "(ERROR)"
                source: "level"
                replace: "error"

I still don’t know why this configuration does not work in my case.

I think the source is not correct in this case.
I tried to use replace, but it does not work with source:

            - replace:
                expression: "(INFO)"
                source: "severity"
                replace: "info"

Without source, it works.

            - replace:
                expression: "(INFO)"
                replace: "info"

Which source should I use in this case, @tonyswumac?

You would use source with replace when there is a previously parsed value to run the expression against. Without source you are replacing against the original log line.
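
For example (not tested), with source the expression runs against the previously extracted value, and without it the expression runs against the raw log line:

            - replace:
                expression: "(INFO)"
                source: "level"        # match against the extracted value named "level"
                replace: "info"

            - replace:
                expression: "(INFO)"   # no source: match and rewrite the log line itself
                replace: "info"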

My original recommendation still stands: you should have different pipelines dealing with different log formats. I don't have experience running GitLab in a container, but I would double-check the Docker logs and see if the log stream comes with any sort of tag you can use to separate the log files.
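
Even without such a tag, one possible (untested) shape is to route the different formats with match stages inside a single job by keying off the line content. The selectors below are just assumptions based on your samples:

pipeline_stages:
  # JSON lines that carry an uppercase "severity" field (e.g. sidekiq)
  - match:
      selector: '{source="gitlab"} |= "\"severity\":"'
      stages:
        - json:
            expressions:
              level: severity
        - template:
            source: level
            template: '{{ ToLower .Value }}'
        - labels:
            level:

  # logfmt lines that already have a lowercase level= key (e.g. postgres-exporter)
  - match:
      selector: '{source="gitlab"} |= "level="'
      stages:
        - logfmt:
            mapping:
              level: level
        - labels:
            level: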

Thanks @tonyswumac