Metric for the amount of logs

I have a metric for the amount of logs (in bytes) coming from our services. It's a special metric created by a cronjob, which posts one data point every 24 hours with the index size of our logs.
I want to create a smart alert on it for unexpected growth, roughly +30% per day.

I'm new to Grafana and ChatGPT doesn't give me the best solution.
I created this alert:

```
elasticsearch_logs_index_size_bytes / (elasticsearch_logs_index_size_bytes offset 24h) > 1.3
```

The current alert only triggers when a single day's value is more than 30% above the previous day's. Consistent daily increases just below the threshold, say 29% per day, go undetected even though they compound quickly. A more effective alert is needed to capture these sustained smaller increases.
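One common workaround is to compare against a longer baseline, so that sub-threshold growth which compounds over several days still trips the alert. The sketch below (plain Python, not Grafana-specific; the function names and thresholds are illustrative assumptions, not anything from the original setup) checks the geometric-mean daily growth over a multi-day window:

```python
# Sketch: detect sustained growth from daily index-size samples.
# `sizes` is a chronological list of daily size readings (bytes).

def sustained_growth(sizes, window=7, daily_threshold=1.25):
    """Fire if the geometric-mean daily growth over the last `window`
    days exceeds `daily_threshold`. This catches compounding increases
    (e.g. 29%/day) that stay just under a 30% single-day threshold."""
    if len(sizes) < window + 1:
        return False  # not enough history yet
    total_ratio = sizes[-1] / sizes[-1 - window]  # total growth over the window
    avg_daily = total_ratio ** (1 / window)       # geometric mean per-day growth
    return avg_daily > daily_threshold

# 29%/day growth: never trips a 30% day-over-day alert, but this does.
sizes = [100 * 1.29 ** i for i in range(9)]
print(sustained_growth(sizes))  # True
```

In PromQL this roughly corresponds to widening the offset and raising the threshold accordingly, e.g. `elasticsearch_logs_index_size_bytes / (elasticsearch_logs_index_size_bytes offset 7d) > 4.77` (since 1.25^7 ≈ 4.77). The trade-off is slower detection: a longer window smooths out one-day spikes but takes several days to confirm a trend, so you may want both alerts side by side.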