Loki missing chunk files in /loki/chunks/fake

Sometimes Loki and Promtail get rebuilt in k8s to update config or to collect old logs, and I found that there are many directories under /loki/chunks/fake.

I know that ‘fake’ comes from the orgID configuration, but I want to know what the files under the path ‘/loki/chunks/fake’ are.

They look like 16-character hex numbers; how are these directory names produced?
This makes Loki unable to find chunks.

The index is an index over log labels, and the chunks are the actual logs, grouped by a specific set of labels. When reading logs, Loki first narrows down which chunk files to read by searching for the relevant labels in the index; the index is essentially how you find which chunks to read.

If you get a missing-chunk-file error, it means either the chunk was removed outside of Loki, or some misconfiguration resulted in files being stored incorrectly.
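In rough pseudocode (hypothetical names, not Loki's actual API), that index-then-chunks lookup looks like this:

```go
package main

import "fmt"

// Toy sketch of the read path: the index maps label matchers to chunk
// references, and only the referenced chunk files are then opened and
// scanned for log lines. All names here are illustrative.
type chunkRef struct {
	Path string // where the chunk object lives on disk
}

// lookupIndex stands in for the real index query: it narrows a set of
// label matchers down to the chunks containing matching series.
func lookupIndex(matchers map[string]string) []chunkRef {
	// toy data: a single series matching {job="varlogs"}
	return []chunkRef{{Path: "fake/c7bcc606af799e49/MThm..."}}
}

func main() {
	for _, ref := range lookupIndex(map[string]string{"job": "varlogs"}) {
		// if this file is missing on disk, the query fails with a
		// "no such file or directory" error
		fmt.Println("would open chunk:", ref.Path)
	}
}
```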

Thanks for your answer.

But there is still something confusing me.

500 Internal Server Error: [open /loki/chunks/fake/c7bcc606af799e49/MThmNTBmNjYwOWU6MThmNTI3NDM1M2I6MjkxNDIwZGY=: no such file or directory

1. I rebuilt Loki on May 8th.
2. When I query logs after May 8th, everything is fine. When I query logs before May 8th, Loki shows me the error response.
3. I have confirmed that the ‘/c7bcc606af799e49’ path does not exist.
4. I haven’t changed the static labels in Promtail, and I can verify that by requesting the Loki API.

What confuses me is that:

1. If Loki knows to look for the chunk file ‘/loki/chunks/fake/c7bcc606af799e49/MThmNTBmNjYwOWU6MThmNTI3NDM1M2I6MjkxNDIwZGY=’, that means the index file is serving normally.
2. I have seen chunk files saved directly under the path ‘/loki/chunks/fake’, so how was the directory ‘/c7bcc606af799e49’ produced?
3. I didn’t delete or move any chunk files, and I’m sure the nobhub path is the same as before I rebuilt Loki, and I can find the chunk files. Was ‘/c7bcc606af799e49’ calculated in some way, or by some configuration?

The paths where the chunks are stored are calculated at flush time. If I remember correctly, they are hashes of the chunks being written.

One scenario: if you rebuilt your Loki cluster and did not shut the previous cluster down cleanly, some chunks may not have been fully flushed.

Hashes of the chunks?

Does that mean the log content? Or is it calculated from the labels?
There are so many directories under /loki/chunks/fake; I really want to be sure of the rule for calculating the hash value, but I can’t find it in the program code.

Are there any pointers to the relevant code?

What’s more, I have some chunk files directly under the /loki/chunks path.

So I think Loki doesn’t calculate the hash sometimes. Maybe some config affects the path where chunk files are saved?

I am not very knowledgeable about the source code, but I believe you’d want to look at the chunk client code, specifically loki/pkg/storage/chunk/client/object_client.go at main · grafana/loki · GitHub.

Thanks a lot, I think I fixed the problem.

I use Loki 2.5.0. I read the source code and found that the mistake was probably in my ‘schema_config’.

I added this schema_config:

schema_config:
  configs:
    - from: 2020-10-24
      store: boltdb-shipper
      object_store: filesystem
      schema: v11
      index:
        prefix: index_
        period: 24h
      chunks:
        prefix: ""
        period: 24h
    - from: 2024-01-01
      store: boltdb-shipper
      object_store: filesystem
      schema: v12
      index:
        prefix: index_
        period: 24h
      chunks:
        prefix: ""
        period: 24h

And Loki puts chunks this way (excerpted):

func (o *Client) PutChunks(ctx context.Context, chunks []chunk.Chunk) error {
	var (
		chunkKeys []string
		chunkBufs [][]byte
	)

	for i := range chunks {
		// ... encode each chunk and compute its external key ...
	}

	incomingErrors := make(chan error)
	for i := range chunkBufs {
		go func(i int) {
			incomingErrors <- o.store.PutObject(ctx, chunkKeys[i], bytes.NewReader(chunkBufs[i]))
		}(i)
	}
	// ... collect incomingErrors ...
}

func (cfg SchemaConfig) ExternalKey(chunk Chunk) string {
	p, err := cfg.SchemaForTime(chunk.From)
	v, _ := p.VersionAsInt()
	if err == nil && v >= 12 {
		return cfg.newerExternalKey(chunk)
	} else if chunk.ChecksumSet {
		return cfg.newExternalKey(chunk)
	} else {
		return cfg.legacyExternalKey(chunk)
	}
}
So chunks end up under different paths with different schema_config versions.
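The two key shapes can be sketched like this (format strings assumed from a reading of the Loki source; treat them as illustrative, not authoritative): before schema v12, the fingerprint and the time range share one path segment, while v12 promotes the fingerprint to its own directory level, which would explain why directories like c7bcc606af799e49 appear only for the v12 period.

```go
package main

import "fmt"

// Assumed pre-v12 key shape: everything after the tenant in a single
// path segment, joined by colons.
func preV12Key(userID string, fp uint64, from, through int64, checksum uint32) string {
	return fmt.Sprintf("%s/%x:%x:%x:%x", userID, fp, from, through, checksum)
}

// Assumed v12 key shape: the series fingerprint becomes its own
// directory level under the tenant.
func v12Key(userID string, fp uint64, from, through int64, checksum uint32) string {
	return fmt.Sprintf("%s/%x/%x:%x:%x", userID, fp, from, through, checksum)
}

func main() {
	// Values taken from the error message earlier in this thread.
	fp := uint64(0xc7bcc606af799e49)
	fmt.Println(preV12Key("fake", fp, 0x18f50f6609e, 0x18f5274353b, 0x291420df))
	// -> fake/c7bcc606af799e49:18f50f6609e:18f5274353b:291420df
	fmt.Println(v12Key("fake", fp, 0x18f50f6609e, 0x18f5274353b, 0x291420df))
	// -> fake/c7bcc606af799e49/18f50f6609e:18f5274353b:291420df
}
```

With the filesystem store, the last segment is additionally base64-encoded on disk, which matches the file name in the error above.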