Ingest EDN files

Hi,

I’m trying to ingest logs from a Datomic database, but it writes them in the EDN format. Example below:

2024-10-17 15:19:03.060 INFO  default    org.eclipse.jetty.util.log - Logging initialized @1902ms to org.eclipse.jetty.util.log.Slf4jLog
2024-10-17 15:19:03.346 INFO  default    datomic.slf4j.bridge - SLF4J Bridge installed
2024-10-17 15:19:03.353 INFO  default    datomic.transactor - {"datomic.metricsCallback" datomic-exporter.metrics/ingest-metrics, :tid 20, "datomic.printConnectionInfo" true, "datomic.prefetchProbes" true, "datomic.memcachedLib" "spy", "datomic.txTimeoutMsec" 10000, "datomic.indexWorkDir" #object[java.io.File 0x47d720e "./data/indexer"], "datomic.heartbeatIntervalMsec" 5000, "datomic.memoryIndexThreshold" 33554432, "datomic.s3RetryBaseDelay" 100, "datomic.dynamicIndexParallelism" false, "datomic.s3ClientExecutionTimeout" 5500, "datomic.backupBranchConcurrency" 32, "datomic.indexParallelism" 1, "datomic.localMemcachedConfigTimeoutMsec" 100, "datomic.queryPool" 32, "datomic.versionUnique" "7187", "datomic.objectCacheMax" 134217728, "datomic.allowLogOverlap" false, "datomic.externalSortPool" 4, "datomic.ddbClientExecutionTimeout" 1100, "datomic.readAheadPool" 12, "datomic.version" "1.0.7187", "datomic.ddbSocketTimeout" 900, "datomic.s3RequestTimeout" 5000, "datomic.backupUseSegsetStorage" true, "datomic.efsDeletePool" 128, :pid 92854, :event :config/properties, "datomic.dataDir" #object[java.io.File 0x61b7fef2 "./data"], "datomic.s3BackupConcurrency" 25, "datomic.s3ConnectionTimeout" 4500, "datomic.memoryIndexMax" 268435456, "datomic.ddbConnectionTimeout" 900, "datomic.s3SocketTimeout" 4500, "datomic.memcachedGetTimeoutMsec" 20, "datomic.valcachePutsPool" 4, "datomic.fileBackupConcurrency" 5, "datomic.writeConcurrency" 4, "datomic.memcachedRepairFromSpyRatio" 0.1, "datomic.useIndexArrayCaches" true, "datomic.localMemcachedAutoDiscovery" false, "datomic.s3MaxRetries" 9, "datomic.podGcDelayMsec" 60000, "datomic.memcachedExpirationDays" 30, "datomic.defaultPartition" :db.part/user, "datomic.indexIOParallelism" 100, "datomic.readConcurrency" 8, "datomic.efsWritePool" 128, "datomic.indexDirScale" 1, "datomic.memcachedAutoDiscovery" false, "datomic.cloudwatchName" "Transactor", "datomic.peerConnectionTTLMsec" 10000, "datomic.memcachedConfigTimeoutMsec" 100, "datomic.deleteConcurrency" 1, "datomic.exciseIOParallelism" 100, "datomic.ddbRequestTimeout" 1000, "datomic.buildRevision" 7187, "datomic.prefetchConcurrency" 5}
2024-10-17 15:19:03.558 INFO  default    datomic.process-monitor - {:event :metrics/initializing, :metricsCallback datomic-exporter.metrics/ingest-metrics, :phase :begin, :pid 92854, :tid 20}
2024-10-17 15:19:04.058 INFO  default    datomic.log-gc - {:gcName "G1 Young Generation", :gcAction "end of minor GC", :gcCause "Metadata GC Threshold", :event :gc, :duration 8, :pid 92854, :tid 13}
2024-10-17 15:19:05.092 INFO  default    datomic.log-gc - {:gcName "G1 Young Generation", :gcAction "end of minor GC", :gcCause "G1 Evacuation Pause", :event :gc, :duration 10, :pid 92854, :tid 13}
2024-10-17 15:19:05.604 INFO  default    datomic.process-monitor - {:event :metrics/initializing, :metricsCallback datomic-exporter.metrics/ingest-metrics, :msec 2040.0, :phase :end, :pid 92854, :tid 20}
2024-10-17 15:19:05.604 INFO  default    datomic.process-monitor - {:metrics/started datomic-exporter.metrics/ingest-metrics, :pid 92854, :tid 20}
2024-10-17 15:19:05.605 INFO  default    datomic.domain - {:event :cache/create, :cache-bytes 134217728, :pid 92854, :tid 30}
2024-10-17 15:19:05.609 INFO  default    datomic.process-monitor - {:GcPauseMsec {:lo 8, :hi 10, :sum 18, :count 2}, :AvailableMB 725.0, :ObjectCacheCount 0, :event :metrics, :pid 92854, :tid 30}
2024-10-17 15:19:05.611 INFO  default    common-metrics.components.prometheus - {:line 67, :cid "DEFAULT", :log :starting-prometheus-reporter}
2024-10-17 15:19:05.629 INFO  default    datomic.lifecycle - {:tid 32, :username "XwBRRa99KZhW7EhMVaj01SD5SrJvPqvprZZG9Xes9Sk=", :port 4334, :rev 73468, :host "localhost", :pid 92854, :event :transactor/heartbeat, :version "1.0.7187", :timestamp 1729189145619, :encrypt-channel true}
2024-10-17 15:19:05.629 INFO  default    datomic.transactor - {:event :transactor/start, :args {:log-dir "log", :protocol :dev, :rest-alias "dev", :memory-index-max "256m", :port 4334, :memory-index-threshold "32m", :data-dir "./data", :object-cache-max "128m", :host "localhost", :metrics-callback "datomic-exporter.metrics/ingest-metrics", :version "1.0.7187", :encrypt-channel true}, :pid 92854, :tid 20}
2024-10-17 15:19:05.688 INFO  default    o.a.activemq.artemis.core.server - AMQ221000: live Message Broker is starting with configuration Broker Configuration (clustered=false,journalDirectory=./data/artemis,bindingsDirectory=data/bindings,largeMessagesDirectory=data/largemessages,pagingDirectory=data/paging)

I’m trying to use Promtail, but with little success. What would be a good way to ingest this type of data (with the nested EDN payloads) into Loki so that it’s searchable? Any tips or direction?

I could not find anything =/

First time I’ve seen this format. Looks like inverted JSON, sort of.

That said, I don’t know of anything that can process this nicely. Even Fluentd doesn’t seem to support it. I think you have a couple of options:

  1. Forward the logs to Loki as is. This, however, will make parsing the logs rather difficult, since the nested EDN payload stays opaque (see the Promtail sketch after this list).
  2. Write some sort of service or a simple script to convert the EDN lines into JSON, then have Promtail or the Alloy agent pick up the JSON files. You’d of course have to invest some resources into writing that converter yourself (a rough sketch follows below as well).
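
For option 1, a Promtail pipeline can at least lift the fixed columns (timestamp, level, logger) into structured data while keeping the EDN payload as the log line. A minimal, untested sketch; the file path and label choices are placeholders:

```yaml
scrape_configs:
  - job_name: datomic
    static_configs:
      - targets: [localhost]
        labels:
          job: datomic
          __path__: /var/log/datomic/*.log   # placeholder path
    pipeline_stages:
      # split "<timestamp> <LEVEL> <context> <logger> - <message>"
      - regex:
          expression: '^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3})\s+(?P<level>\w+)\s+(?P<context>\S+)\s+(?P<logger>\S+) - (?P<message>.*)$'
      - timestamp:
          source: ts
          format: '2006-01-02 15:04:05.000'
      - labels:
          level:
```

The EDN map itself remains an opaque string to Loki, but you can filter by level and then line-filter on fragments like `:event :gc` at query time.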
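
For option 2, since EDN is Clojure’s native data format, the converter is probably easiest to write in Clojure itself. A rough, untested sketch (the namespace, regex, and output field names are my own assumptions based on your sample, and it needs org.clojure/data.json on the classpath); unknown tagged literals like `#object[...]` are simply stringified:

```clojure
(ns edn-log->json
  (:require [clojure.edn :as edn]
            [clojure.data.json :as json]
            [clojure.string :as str]
            [clojure.walk :as walk]))

;; assumed layout: "<timestamp> <LEVEL> <context> <logger> - <message>"
(def line-re #"^(\S+ \S+) (\w+)\s+(\S+)\s+(\S+) - (.*)$")

(defn parse-payload
  "Parse EDN map payloads; plain-text messages pass through unchanged.
   Unknown tagged literals (e.g. #object[...]) are stringified, and
   anything unparseable falls back to the raw message text."
  [s]
  (if (str/starts-with? s "{")
    (try
      (edn/read-string {:default (fn [tag value] (str "#" tag " " (pr-str value)))} s)
      (catch Exception _ s))
    s))

(defn jsonable
  "Stringify keywords and symbols so the payload is plain JSON data
   (\":event\" stays recognizable as a keyword)."
  [x]
  (walk/postwalk #(if (or (keyword? %) (symbol? %)) (str %) %) x))

(defn line->json [line]
  (if-let [[_ ts level context logger msg] (re-matches line-re line)]
    (json/write-str {"timestamp" ts
                     "level"     level
                     "context"   context
                     "logger"    logger
                     "message"   (jsonable (parse-payload msg))})
    ;; anything that doesn't match the assumed layout
    (json/write-str {"message" line})))

;; read log lines on stdin, emit one JSON object per line on stdout
(doseq [line (line-seq (java.io.BufferedReader. *in*))]
  (println (line->json line)))
```

Run it as a filter (e.g. `clojure -M convert.clj < datomic.log > datomic.json`; the file names are hypothetical), then point Promtail or Alloy at the JSON output and use Promtail’s `json` stage or LogQL’s `| json` to query the nested fields.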

You could always ask Datomic about logging in a more standard format? lol, that probably won’t get you anywhere of course; looks like they’ve been around for quite some time.