Hello,
I am attempting to connect Grafana to Loki as a data source.
I am using the loki-distributed Helm chart in a Kubernetes cluster. As far as I can tell, everything is working on the write side of the pipeline; the only issue is with the read side.
The Grafana UI returns the following error:
Loki: Internal Server Error. 500. rpc error: code = Unavailable desc = name resolver error: produced zero addresses
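For context, I added the data source through the UI; a provisioned equivalent would look roughly like the following sketch (the `url` is the in-cluster service name I used — everything else is Grafana's defaults):

```yaml
# Grafana data source provisioning sketch (equivalent of my UI settings).
apiVersion: 1
datasources:
  - name: Loki
    type: loki
    access: proxy
    url: http://loki-loki-distributed-query-frontend:3100
```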
I entered loki-loki-distributed-query-frontend:3100 into the HTTP URL field. With debug mode enabled, the querier produces the following logs:
k logs pod/loki-loki-distributed-querier-0
2021-12-17 20:39:26.076078 I | proto: duplicate proto type registered: purgeplan.DeletePlan
2021-12-17 20:39:26.076127 I | proto: duplicate proto type registered: purgeplan.ChunksGroup
2021-12-17 20:39:26.076133 I | proto: duplicate proto type registered: purgeplan.ChunkDetails
2021-12-17 20:39:26.076137 I | proto: duplicate proto type registered: purgeplan.Interval
2021-12-17 20:39:26.098305 I | proto: duplicate proto type registered: grpc.PutChunksRequest
2021-12-17 20:39:26.098322 I | proto: duplicate proto type registered: grpc.GetChunksRequest
2021-12-17 20:39:26.098325 I | proto: duplicate proto type registered: grpc.GetChunksResponse
2021-12-17 20:39:26.098328 I | proto: duplicate proto type registered: grpc.Chunk
2021-12-17 20:39:26.098330 I | proto: duplicate proto type registered: grpc.ChunkID
2021-12-17 20:39:26.098333 I | proto: duplicate proto type registered: grpc.DeleteTableRequest
2021-12-17 20:39:26.098336 I | proto: duplicate proto type registered: grpc.DescribeTableRequest
2021-12-17 20:39:26.098339 I | proto: duplicate proto type registered: grpc.WriteBatch
2021-12-17 20:39:26.098342 I | proto: duplicate proto type registered: grpc.WriteIndexRequest
2021-12-17 20:39:26.098345 I | proto: duplicate proto type registered: grpc.DeleteIndexRequest
2021-12-17 20:39:26.098348 I | proto: duplicate proto type registered: grpc.QueryIndexResponse
2021-12-17 20:39:26.098350 I | proto: duplicate proto type registered: grpc.Row
2021-12-17 20:39:26.098353 I | proto: duplicate proto type registered: grpc.IndexEntry
2021-12-17 20:39:26.098355 I | proto: duplicate proto type registered: grpc.QueryIndexRequest
2021-12-17 20:39:26.098358 I | proto: duplicate proto type registered: grpc.UpdateTableRequest
2021-12-17 20:39:26.098361 I | proto: duplicate proto type registered: grpc.DescribeTableResponse
2021-12-17 20:39:26.098363 I | proto: duplicate proto type registered: grpc.CreateTableRequest
2021-12-17 20:39:26.098366 I | proto: duplicate proto type registered: grpc.TableDesc
2021-12-17 20:39:26.098368 I | proto: duplicate proto type registered: grpc.TableDesc.TagsEntry
2021-12-17 20:39:26.098371 I | proto: duplicate proto type registered: grpc.ListTablesResponse
2021-12-17 20:39:26.098373 I | proto: duplicate proto type registered: grpc.Labels
2021-12-17 20:39:26.098492 I | proto: duplicate proto type registered: storage.Entry
2021-12-17 20:39:26.098503 I | proto: duplicate proto type registered: storage.ReadBatch
level=info ts=2021-12-17T20:39:26.104424711Z caller=main.go:94 msg="Starting Loki" version="(version=2.4.1, branch=HEAD, revision=f61a4d261)"
level=info ts=2021-12-17T20:39:26.104742789Z caller=server.go:260 http=[::]:3100 grpc=[::]:9095 msg="server listening on addresses"
level=info ts=2021-12-17T20:39:26.105408266Z caller=memberlist_client.go:394 msg="Using memberlist cluster node name" name=loki-loki-distributed-querier-0-bd677d02
ts=2021-12-17T20:39:26.105515764Z caller=memberlist_logger.go:74 level=debug msg="configured Transport is not a NodeAwareTransport and some features may not work as desired"
level=debug ts=2021-12-17T20:39:26.106076551Z caller=tcp_transport.go:389 msg=FinalAdvertiseAddr advertiseAddr=192.168.80.3 advertisePort=7946
level=debug ts=2021-12-17T20:39:26.106623651Z caller=tcp_transport.go:389 msg=FinalAdvertiseAddr advertiseAddr=192.168.80.3 advertisePort=7946
ts=2021-12-17T20:39:31.111316216Z caller=memberlist_logger.go:74 level=debug msg="Initiating push/pull sync with: 192.168.80.5:7946"
ts=2021-12-17T20:39:31.118814279Z caller=memberlist_logger.go:74 level=debug msg="Initiating push/pull sync with: 192.168.104.1:7946"
ts=2021-12-17T20:39:31.125709539Z caller=memberlist_logger.go:74 level=debug msg="Initiating push/pull sync with: 192.168.40.4:7946"
ts=2021-12-17T20:39:31.129798684Z caller=memberlist_logger.go:74 level=debug msg="Initiating push/pull sync with: 192.168.0.5:7946"
level=debug ts=2021-12-17T20:39:31.130616164Z caller=worker_service.go:149 msg="determining if querier is running as standalone target" runningStandalone=true queryFrontendEnabled=false queryScheduleEnabled=false readEnabled=false allEnabled=false
level=info ts=2021-12-17T20:39:31.130969834Z caller=worker.go:118 msg="Starting querier worker connected to query-frontend" frontend=loki-loki-distributed-query-frontend:9095
level=info ts=2021-12-17T20:39:31.132122215Z caller=module_service.go:64 msg=initialising module=server
level=debug ts=2021-12-17T20:39:31.132159294Z caller=module_service.go:54 msg="module waiting for initialization" module=store waiting_for=ingester-querier
level=debug ts=2021-12-17T20:39:31.132209577Z caller=module_service.go:54 msg="module waiting for initialization" module=ingester-querier waiting_for=memberlist-kv
level=debug ts=2021-12-17T20:39:31.132200668Z caller=module_service.go:54 msg="module waiting for initialization" module=querier waiting_for=ingester-querier
level=debug ts=2021-12-17T20:39:31.132275534Z caller=module_service.go:54 msg="module waiting for initialization" module=ring waiting_for=memberlist-kv
level=info ts=2021-12-17T20:39:31.132281669Z caller=module_service.go:64 msg=initialising module=memberlist-kv
level=debug ts=2021-12-17T20:39:31.132377902Z caller=module_service.go:54 msg="module waiting for initialization" module=ring waiting_for=server
level=info ts=2021-12-17T20:39:31.132394615Z caller=module_service.go:64 msg=initialising module=ring
level=debug ts=2021-12-17T20:39:31.133995333Z caller=module_service.go:54 msg="module waiting for initialization" module=ingester-querier waiting_for=ring
ts=2021-12-17T20:39:31.135239045Z caller=memberlist_logger.go:74 level=debug msg="Initiating push/pull sync with: 192.168.40.5:7946"
level=debug ts=2021-12-17T20:39:31.135334494Z caller=module_service.go:54 msg="module waiting for initialization" module=ingester-querier waiting_for=server
level=info ts=2021-12-17T20:39:31.135437402Z caller=module_service.go:64 msg=initialising module=ingester-querier
level=debug ts=2021-12-17T20:39:31.135645783Z caller=module_service.go:54 msg="module waiting for initialization" module=querier waiting_for=memberlist-kv
level=debug ts=2021-12-17T20:39:31.135763797Z caller=module_service.go:54 msg="module waiting for initialization" module=querier waiting_for=ring
level=debug ts=2021-12-17T20:39:31.135864266Z caller=module_service.go:54 msg="module waiting for initialization" module=querier waiting_for=server
level=debug ts=2021-12-17T20:39:31.135910366Z caller=module_service.go:54 msg="module waiting for initialization" module=querier waiting_for=store
level=debug ts=2021-12-17T20:39:31.135947607Z caller=module_service.go:54 msg="module waiting for initialization" module=store waiting_for=memberlist-kv
level=debug ts=2021-12-17T20:39:31.135998775Z caller=module_service.go:54 msg="module waiting for initialization" module=store waiting_for=ring
level=debug ts=2021-12-17T20:39:31.136044975Z caller=module_service.go:54 msg="module waiting for initialization" module=store waiting_for=server
level=info ts=2021-12-17T20:39:31.136093662Z caller=module_service.go:64 msg=initialising module=store
level=info ts=2021-12-17T20:39:31.136162376Z caller=module_service.go:64 msg=initialising module=querier
level=info ts=2021-12-17T20:39:31.136841905Z caller=loki.go:318 msg="Loki started"
level=info ts=2021-12-17T20:39:31.137552251Z caller=memberlist_client.go:506 msg="joined memberlist cluster" reached_nodes=5
level=info ts=2021-12-17T20:39:31.14032515Z caller=worker.go:205 msg="adding connection" addr=192.168.0.11:9095
level=warn ts=2021-12-17T20:39:31.140556783Z caller=worker.go:268 msg="total worker concurrency is greater than promql max concurrency. Queries may be queued in the querier which reduces QOS"
level=debug ts=2021-12-17T20:40:01.635599626Z caller=logging.go:67 traceID=2b1428317ab3a623 orgID=fake msg="GET /ready (200) 173.496µs"
ts=2021-12-17T20:40:03.95294199Z caller=memberlist_logger.go:74 level=debug msg="Initiating push/pull sync with: loki-loki-distributed-distributor-7745846956-fntjp-c830ed9b 192.168.80.5:7946"
level=debug ts=2021-12-17T20:40:11.635458034Z caller=logging.go:67 traceID=0752abf101843d53 orgID=fake msg="GET /ready (200) 91.401µs"
level=debug ts=2021-12-17T20:40:21.635386767Z caller=logging.go:67 traceID=6a5c91e34c70de62 orgID=fake msg="GET /ready (200) 85.358µs"
level=debug ts=2021-12-17T20:40:31.635313832Z caller=logging.go:67 traceID=067ae958c96da13c orgID=fake msg="GET /ready (200) 92.163µs"
ts=2021-12-17T20:40:33.956127909Z caller=memberlist_logger.go:74 level=debug msg="Initiating push/pull sync with: loki-loki-distributed-ingester-1-c5c5835d 192.168.80.4:7946"
ts=2021-12-17T20:40:38.203226764Z caller=memberlist_logger.go:74 level=debug msg="Stream connection from=192.168.40.5:47758"
level=debug ts=2021-12-17T20:40:41.635377583Z caller=logging.go:67 traceID=2128778068f8c388 orgID=fake msg="GET /ready (200) 98.474µs"
ts=2021-12-17T20:40:46.011753182Z caller=memberlist_logger.go:74 level=debug msg="Stream connection from=192.168.40.4:54240"
level=debug ts=2021-12-17T20:40:51.636167044Z caller=logging.go:67 traceID=470f64e08e1c45d3 orgID=fake msg="GET /ready (200) 107.945µs"
level=debug ts=2021-12-17T20:41:01.635570039Z caller=logging.go:67 traceID=3396057c1584c79d orgID=fake msg="GET /ready (200) 110.94µs"
ts=2021-12-17T20:41:03.96657786Z caller=memberlist_logger.go:74 level=debug msg="Initiating push/pull sync with: loki-loki-distributed-compactor-5fbbbc57cf-zjpmh-18588d5f 192.168.104.2:7946"
level=debug ts=2021-12-17T20:41:11.635349911Z caller=logging.go:67 traceID=11294384adbd1cf8 orgID=fake msg="GET /ready (200) 61.693µs"
Without debug enabled, the query-frontend and querier report:
loki2/loki-loki-distributed-query-frontend-c947c647b-zm82n[loki]: level=error ts=2021-12-17T18:24:35.278379163Z caller=retry.go:73 org_id=fake msg="error processing request" try=0 err="rpc error: code = Code(500) desc = rpc error: code = Unavailable desc = name resolver error: produced zero addresses\n"
loki2/loki-loki-distributed-querier-0[loki]: level=error ts=2021-12-17T18:24:35.278090494Z caller=chunk_store.go:524 org_id=fake msg="error querying storage" err="rpc error: code = Unavailable desc = name resolver error: produced zero addresses"
loki2/loki-loki-distributed-querier-0[loki]: level=warn ts=2021-12-17T18:24:35.278178895Z caller=logging.go:72 traceID=3aed26e94f67384e orgID=fake msg="GET /loki/api/v1/labels?end=1639765475270116729&start=1639764874731000000 (500) 4.689324ms Response: \"rpc error: code = Unavailable desc = name resolver error: produced zero addresses\\n\" ws: false; X-Scope-Orgid: fake; uber-trace-id: 3aed26e94f67384e:034b0791ba0afd6b:415e5d90759cb78b:0; "
loki2/loki-loki-distributed-querier-0[loki]: level=error ts=2021-12-17T18:24:35.30565576Z caller=chunk_store.go:524 org_id=fake msg="error querying storage" err="rpc error: code = Unavailable desc = name resolver error: produced zero addresses"
loki2/loki-loki-distributed-querier-0[loki]: level=warn ts=2021-12-17T18:24:35.305831754Z caller=logging.go:72 traceID=3aed26e94f67384e orgID=fake msg="GET /loki/api/v1/labels?end=1639765475270116729&start=1639764874731000000 (500) 27.118854ms Response: \"rpc error: code = Unavailable desc = name resolver error: produced zero addresses\\n\" ws: false; X-Scope-Orgid: fake; uber-trace-id: 3aed26e94f67384e:034b0791ba0afd6b:415e5d90759cb78b:0; "
loki2/loki-loki-distributed-query-frontend-c947c647b-zm82n[loki]: level=error ts=2021-12-17T18:24:35.306224589Z caller=retry.go:73 org_id=fake msg="error processing request" try=1 err="rpc error: code = Code(500) desc = rpc error: code = Unavailable desc = name resolver error: produced zero addresses\n"
loki2/loki-loki-distributed-querier-0[loki]: level=error ts=2021-12-17T18:24:35.309880753Z caller=chunk_store.go:524 org_id=fake msg="error querying storage" err="rpc error: code = Unavailable desc = name resolver error: produced zero addresses"
loki2/loki-loki-distributed-querier-0[loki]: level=warn ts=2021-12-17T18:24:35.309966476Z caller=logging.go:72 traceID=3aed26e94f67384e orgID=fake msg="GET /loki/api/v1/labels?end=1639765475270116729&start=1639764874731000000 (500) 3.327181ms Response: \"rpc error: code = Unavailable desc = name resolver error: produced zero addresses\\n\" ws: false; X-Scope-Orgid: fake; uber-trace-id: 3aed26e94f67384e:034b0791ba0afd6b:415e5d90759cb78b:0; "
loki2/loki-loki-distributed-query-frontend-c947c647b-zm82n[loki]: level=error ts=2021-12-17T18:24:35.31026562Z caller=retry.go:73 org_id=fake msg="error processing request" try=2 err="rpc error: code = Code(500) desc = rpc error: code = Unavailable desc = name resolver error: produced zero addresses\n"
loki2/loki-loki-distributed-querier-0[loki]: level=error ts=2021-12-17T18:24:35.313429444Z caller=chunk_store.go:524 org_id=fake msg="error querying storage" err="rpc error: code = Unavailable desc = name resolver error: produced zero addresses"
loki2/loki-loki-distributed-querier-0[loki]: level=warn ts=2021-12-17T18:24:35.313517291Z caller=logging.go:72 traceID=3aed26e94f67384e orgID=fake msg="GET /loki/api/v1/labels?end=1639765475270116729&start=1639764874731000000 (500) 2.83642ms Response: \"rpc error: code = Unavailable desc = name resolver error: produced zero addresses\\n\" ws: false; X-Scope-Orgid: fake; uber-trace-id: 3aed26e94f67384e:034b0791ba0afd6b:415e5d90759cb78b:0; "
loki2/loki-loki-distributed-querier-0[loki]: level=error ts=2021-12-17T18:24:35.316181803Z caller=chunk_store.go:524 org_id=fake msg="error querying storage" err="rpc error: code = Unavailable desc = name resolver error: produced zero addresses"
loki2/loki-loki-distributed-querier-0[loki]: level=warn ts=2021-12-17T18:24:35.31627526Z caller=logging.go:72 traceID=3aed26e94f67384e orgID=fake msg="GET /loki/api/v1/labels?end=1639765475270116729&start=1639764874731000000 (500) 2.31158ms Response: \"rpc error: code = Unavailable desc = name resolver error: produced zero addresses\\n\" ws: false; X-Scope-Orgid: fake; uber-trace-id: 3aed26e94f67384e:034b0791ba0afd6b:415e5d90759cb78b:0; "
loki2/grafana-678f986c7-qmnkb[grafana]: t=2021-12-17T18:24:35+0000 lvl=eror msg="Request Completed" logger=context userId=1 orgId=1 uname=admin method=GET path=/api/datasources/proxy/2/loki/api/v1/label status=500 remote_addr=10.220.115.238 time_ms=50 size=82 referer=https://grafana-icboca.risk.regn.net/datasources/edit/kk0KbSo7z
loki2/loki-loki-distributed-query-frontend-c947c647b-zm82n[loki]: level=error ts=2021-12-17T18:24:35.313710476Z caller=retry.go:73 org_id=fake msg="error processing request" try=3 err="rpc error: code = Code(500) desc = rpc error: code = Unavailable desc = name resolver error: produced zero addresses\n"
loki2/loki-loki-distributed-query-frontend-c947c647b-zm82n[loki]: level=error ts=2021-12-17T18:24:35.316489147Z caller=retry.go:73 org_id=fake msg="error processing request" try=4 err="rpc error: code = Code(500) desc = rpc error: code = Unavailable desc = name resolver error: produced zero addresses\n"
loki2/loki-loki-distributed-query-frontend-c947c647b-zm82n[loki]: level=warn ts=2021-12-17T18:24:35.316576782Z caller=logging.go:72 traceID=3aed26e94f67384e orgID=fake msg="GET /loki/api/v1/label?start=1639764874731000000 (500) 46.538652ms Response: \"rpc error: code = Unavailable desc = name resolver error: produced zero addresses\\n\" ws: false; Accept: application/json, text/plain, */*; Accept-Encoding: gzip, deflate, br; Accept-Language: en-US,en;q=0.5; Dnt: 1; Sec-Fetch-Dest: empty; Sec-Fetch-Mode: cors; Sec-Fetch-Site: same-origin; Sec-Gpc: 1; User-Agent: Grafana/8.3.0; X-Forwarded-For: 10.220.115.238, 192.168.80.0, 192.168.80.0; X-Grafana-Nocache: true; X-Grafana-Org-Id: 1; X-Real-Ip: 10.220.115.238; X-Request-Id: 97b32aaa89d01331794683097835af95; X-Scheme: https;
Here is my Loki config file (a Helm values template):
auth_enabled: false
server:
  http_listen_port: 3100
  log_level: debug
distributor:
  ring:
    kvstore:
      store: memberlist
memberlist:
  join_members:
    - {{ include "loki.fullname" . }}-memberlist
ingester:
  lifecycler:
    ring:
      kvstore:
        store: memberlist
      replication_factor: 3
  chunk_idle_period: 1h
  chunk_block_size: 1536000
  chunk_encoding: snappy
  chunk_retain_period: 1m
  max_transfer_retries: 0
  wal:
    dir: /var/loki/wal
limits_config:
  ingestion_rate_strategy: local
  enforce_metric_name: false
  reject_old_samples: true
  reject_old_samples_max_age: 168h
  max_cache_freshness_per_query: 10m
  ingestion_rate_mb: 15
  ingestion_burst_size_mb: 20
  per_stream_rate_limit: 10MB
schema_config:
  configs:
    - from: 2020-09-07
      store: boltdb-shipper
      object_store: aws
      schema: v11
      index:
        prefix: loki_index_
        period: 24h
storage_config:
  aws:
    s3: s3://xxxx:8080/lokibucket
    s3forcepathstyle: true
  index_queries_cache_config:
    memcached:
      batch_size: 100
      parallelism: 100
    memcached_client:
      consistent_hash: true
      host: {{ include "loki.memcachedIndexQueriesFullname" . }}
      service: http
  boltdb_shipper:
    active_index_directory: /var/loki/index
    shared_store: s3
    cache_location: /var/loki/cache
    index_gateway_client:
      server_address: dns:///{{ include "loki.indexGatewayFullname" . }}:9095
chunk_store_config:
  chunk_cache_config:
    memcached:
      batch_size: 100
      parallelism: 100
    memcached_client:
      consistent_hash: true
      host: {{ include "loki.memcachedChunksFullname" . }}
      service: http
table_manager:
  retention_deletes_enabled: false
  retention_period: 0s
querier:
  query_store_only: true
query_range:
  # make queries more cache-able by aligning them with their step intervals
  align_queries_with_step: true
  max_retries: 5
  # parallelize queries in 15min intervals
  split_queries_by_interval: 15m
  cache_results: true
  results_cache:
    cache:
      # We're going to use the in-process "FIFO" cache
      enable_fifocache: true
      fifocache:
        size: 1024
        validity: 24h
frontend_worker:
  frontend_address: {{ include "loki.queryFrontendFullname" . }}:9095
frontend:
  log_queries_longer_than: 5s
  compress_responses: true
  tail_proxy_url: http://{{ include "loki.querierFullname" . }}:3100
compactor:
  working_directory: /loki/boltdb-shipper-compactor
  shared_store: s3
ruler:
  storage:
    type: local
    local:
      directory: /etc/loki/rules
  ring:
    kvstore:
      store: memberlist
  rule_path: /tmp/loki/scratch
  alertmanager_url: https://alertmanager.xx
  external_url: https://alertmanager.xx
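Since the error is a gRPC name-resolver failure, one way to narrow it down is to confirm that the service names the templated addresses render to actually resolve in-cluster. The names below are my assumed renderings of the Helm templates in my install, not verified output:

```shell
# Throwaway pod with DNS tooling; assumes the default namespace.
kubectl run dns-test --rm -it --image=busybox:1.34.1 --restart=Never -- \
  nslookup loki-loki-distributed-query-frontend

# boltdb_shipper.index_gateway_client.server_address uses gRPC's dns:/// scheme,
# so the host it points at must resolve as well:
kubectl run dns-test --rm -it --image=busybox:1.34.1 --restart=Never -- \
  nslookup loki-loki-distributed-index-gateway
```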
k get pods
loki-loki-distributed-compactor-547979c8d8-rtbr7 1/1 Running 0 10d
loki-loki-distributed-distributor-55b4f64f9-55g5p 1/1 Running 0 10d
loki-loki-distributed-distributor-55b4f64f9-59th7 1/1 Running 0 10d
loki-loki-distributed-distributor-55b4f64f9-jb2n4 1/1 Running 0 10d
loki-loki-distributed-gateway-7896dbfb9-hb292 1/1 Running 0 12d
loki-loki-distributed-gateway-7896dbfb9-j6xvr 1/1 Running 0 12d
loki-loki-distributed-gateway-7896dbfb9-v5z7l 1/1 Running 0 12d
loki-loki-distributed-ingester-0 1/1 Running 0 10d
loki-loki-distributed-ingester-1 1/1 Running 0 10d
loki-loki-distributed-ingester-2 1/1 Running 0 10d
loki-loki-distributed-memcached-chunks-0 2/2 Running 0 12d
loki-loki-distributed-memcached-chunks-1 2/2 Running 0 12d
loki-loki-distributed-memcached-chunks-2 2/2 Running 0 12d
loki-loki-distributed-memcached-chunks-3 2/2 Running 0 12d
loki-loki-distributed-memcached-chunks-4 0/2 Pending 0 12d
loki-loki-distributed-memcached-frontend-0 2/2 Running 0 12d
loki-loki-distributed-memcached-frontend-1 2/2 Running 0 12d
loki-loki-distributed-memcached-frontend-2 2/2 Running 0 12d
loki-loki-distributed-memcached-index-queries-0 2/2 Running 0 10d
loki-loki-distributed-memcached-index-queries-1 2/2 Running 0 10d
loki-loki-distributed-memcached-index-queries-2 2/2 Running 0 10d
loki-loki-distributed-memcached-index-writes-0 2/2 Running 0 12d
loki-loki-distributed-memcached-index-writes-1 2/2 Running 0 12d
loki-loki-distributed-memcached-index-writes-2 2/2 Running 0 12d
loki-loki-distributed-querier-0 1/1 Running 0 10d
loki-loki-distributed-querier-1 1/1 Running 0 10d
loki-loki-distributed-query-frontend-c947c647b-zm82n 1/1 Running 0 10d
k get services
NAME TYPE CLUSTER-IP EXTERNAL-IP PORT(S) AGE
grafana ClusterIP 192.168.163.89 <none> 80/TCP 12d
loki-loki-distributed-compactor ClusterIP 192.168.186.160 <none> 3100/TCP 12d
loki-loki-distributed-distributor ClusterIP 192.168.195.44 <none> 3100/TCP,9095/TCP 12d
loki-loki-distributed-gateway ClusterIP 192.168.159.20 <none> 80/TCP 12d
loki-loki-distributed-ingester ClusterIP 192.168.168.177 <none> 3100/TCP,9095/TCP 12d
loki-loki-distributed-ingester-headless ClusterIP None <none> 3100/TCP,9095/TCP 12d
loki-loki-distributed-memberlist ClusterIP None <none> 7946/TCP 12d
loki-loki-distributed-memcached-chunks ClusterIP None <none> 11211/TCP,9150/TCP 12d
loki-loki-distributed-memcached-frontend ClusterIP None <none> 11211/TCP,9150/TCP 12d
loki-loki-distributed-memcached-index-queries ClusterIP None <none> 11211/TCP,9150/TCP 10d
loki-loki-distributed-memcached-index-writes ClusterIP None <none> 11211/TCP,9150/TCP 12d
loki-loki-distributed-querier ClusterIP 192.168.165.99 <none> 3100/TCP,9095/TCP 12d
loki-loki-distributed-querier-headless ClusterIP None <none> 3100/TCP,9095/TCP 12d
loki-loki-distributed-query-frontend ClusterIP None <none> 3100/TCP,9095/TCP,9096/TCP 12d
k get ingress
NAME CLASS HOSTS ADDRESS PORTS AGE
grafana <none> grafana-xxx.net 10.xxx,10.xxx,10.xxx,10.xxx 80, 443 11d
loki-loki-distributed-gateway <none> loki-xxx.net 10.xxx,10.xxx,10.xxx,10.xxx 80, 443 12d
The following log message was of particular interest to me:
level=debug ts=2021-12-17T20:39:31.130616164Z caller=worker_service.go:149 msg="determining if querier is running as standalone target" runningStandalone=true queryFrontendEnabled=false queryScheduleEnabled=false readEnabled=false allEnabled=false