Only Loki pod logs are visible in the Grafana dashboard

I’ve installed grafana-loki and promtail to enable application log forwarding in my cluster. The following is my loki config file:

auth_enabled: false
common:
  compactor_address: 'loki-backend'
  path_prefix: /var/loki
  replication_factor: 3
  storage:
    s3:
      access_key_id: enterprise-logs
      bucketnames: chunks
      endpoint: logging-minio.shared-system.svc:9000
      insecure: true
      s3forcepathstyle: true
      secret_access_key: supersecret
frontend:
  scheduler_address: query-scheduler-discovery.shared-system.svc.cluster.local.:9095
frontend_worker:
  scheduler_address: query-scheduler-discovery.shared-system.svc.cluster.local.:9095
index_gateway:
  mode: ring
limits_config:
  enforce_metric_name: false
  max_cache_freshness_per_query: 10m
  reject_old_samples: true
  reject_old_samples_max_age: 168h
  split_queries_by_interval: 15m
memberlist:
  join_members:
  - loki-memberlist
query_range:
  align_queries_with_step: true
ruler:
  storage:
    s3:
      bucketnames: ruler
    type: s3
runtime_config:
  file: /etc/loki/runtime-config/runtime-config.yaml
schema_config:
  configs:
  - from: "2022-01-11"
    index:
      period: 24h
      prefix: loki_index_
    object_store: s3
    schema: v12
    store: boltdb-shipper
server:
  grpc_listen_port: 9095
  http_listen_port: 3100
storage_config:
  hedging:
    at: 250ms
    max_per_second: 20
    up_to: 3
table_manager:
  retention_deletes_enabled: false
  retention_period: 0

The following is my promtail config file:

# promtail-config.yaml
server:
  # http_listen_address: 127.0.0.1
  http_listen_port: 9080
  grpc_listen_port: 0
scrape_configs:
  - job_name: kubernetes-pods
    kubernetes_sd_configs:
      - namespaces: {}
      - role: pod
    pipeline_stages:
      - docker: {}
    relabel_configs:
      - source_labels: ['__meta_kubernetes_pod_label_app']
        target_label: 'app'
      - source_labels: ['__meta_kubernetes_pod_label_namespace']
        target_label: 'namespace'
      - source_labels: ['__meta_kubernetes_pod_container_name']
        target_label: 'container_name'
  - job_name: system
    static_configs:
    - targets:
        - localhost
      labels:
        job: varlogs
        __path__: /var/log/*log
  # Add more scrape_configs as needed for different log sources
clients:
  - url: http://loki-gateway.shared-system.svc.cluster.local:80/loki/api/v1/push

But in Grafana I can only see the loki pod logs, and only from the namespace where I’ve deployed all of this.

Hi. I’m not 100% sure, but I think the job_name: system logs would not have a namespace or pod label. What if you search for {job="varlogs"}?
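
For example, pasting these two queries into Grafana Explore should separate the two cases. The job label in the first comes straight from the static_configs labels in your promtail config; the app value in the second is just a hypothetical placeholder for one of your own pod labels:

{job="varlogs"}
{app="my-app"}

If the first query returns node-level logs, the static job is fine and it’s only the kubernetes-pods discovery that isn’t producing streams.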

No, I meant that the other namespaces and pods are not showing up for the kubernetes-pods job.
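
One likely culprit, going only off the config pasted above (an assumption, not something confirmed in this thread): under kubernetes_sd_configs, namespaces: {} and role: pod are written as two separate list entries, which makes them two separate SD configs, the first of which has no role at all. Two further points stand out: the namespace relabel reads __meta_kubernetes_pod_label_namespace, but the discovered namespace is exposed by Kubernetes SD as __meta_kubernetes_namespace; and nothing sets __path__, which promtail needs in order to know which files to tail for the discovered pods. A sketch of the job with those three points addressed (the /var/log/pods path and the __path__ relabel follow the standard promtail DaemonSet scrape config and assume the host log directory is mounted into the promtail pod):

scrape_configs:
  - job_name: kubernetes-pods
    kubernetes_sd_configs:
      - role: pod          # role and namespaces belong to the same entry
        namespaces: {}     # empty selector = discover pods in all namespaces
    pipeline_stages:
      - docker: {}
    relabel_configs:
      - source_labels: ['__meta_kubernetes_pod_label_app']
        target_label: 'app'
      - source_labels: ['__meta_kubernetes_namespace']   # namespace is not a pod label
        target_label: 'namespace'
      - source_labels: ['__meta_kubernetes_pod_container_name']
        target_label: 'container_name'
      # promtail only tails targets that resolve to a file path via __path__
      - source_labels: ['__meta_kubernetes_pod_uid', '__meta_kubernetes_pod_container_name']
        separator: '/'
        target_label: '__path__'
        replacement: '/var/log/pods/*$1/*.log'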