AWS CloudWatch Logs

I’m a fan of how Grafana can bring my metrics from different accounts and regions into one place.

Is there a solution to do the same with logs, though? I know Grafana isn’t a log viewer, so how do people address this problem with an open-source tool?

Stream all logs to ES https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CWL_ES_Stream.html and build your Grafana dashboards with the ES data source. It’s a nice, easy, and scalable solution.
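If you go that route, the console wizard wires everything up, but you can also create the subscription filter yourself. Here’s a rough boto3 sketch; the log group name and Lambda ARN are placeholders, and it assumes the forwarding Lambda the wizard creates already exists and allows CloudWatch Logs to invoke it:

```python
import boto3

logs = boto3.client("logs", region_name="us-east-1")

# Subscribe a log group to the forwarding Lambda created by the
# "Stream to Amazon Elasticsearch Service" wizard.
# Log group name and destination ARN below are placeholders.
logs.put_subscription_filter(
    logGroupName="/aws/lambda/my-app",
    filterName="stream-to-es",
    filterPattern="",  # empty pattern forwards every event
    destinationArn="arn:aws:lambda:us-east-1:123456789012:function:LogsToElasticsearch",
)
```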

Elasticsearch is the open-source tool for this job. You can use AWS’s hosted Elasticsearch service, or you can run your own stack on EC2 instances. Going the self-managed route requires setting up Logstash, Elasticsearch, and Kibana: Logstash ingests the logs through its native CloudWatch plugin and ships them to Elasticsearch, which handles the indexing and storage. Kibana is the web interface used for searching and graphing the data, among other things.
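To make that data flow concrete, here’s a rough Python sketch of what the CloudWatch-to-Elasticsearch step amounts to, using boto3 and the Elasticsearch client directly. The endpoint, log group, and index name are made up, and in practice the Logstash CloudWatch input plugin does this polling for you:

```python
from datetime import datetime, timedelta, timezone

import boto3
from elasticsearch import Elasticsearch, helpers

# Roughly what the Logstash pipeline does: pull recent events out of
# a CloudWatch log group and bulk-index them into Elasticsearch.
# Endpoint, log group, and index name are placeholders.
logs = boto3.client("logs", region_name="us-east-1")
es = Elasticsearch("http://my-es-node:9200")

start = int((datetime.now(timezone.utc) - timedelta(minutes=5)).timestamp() * 1000)
events = logs.filter_log_events(
    logGroupName="/aws/lambda/my-app",
    startTime=start,
)["events"]

helpers.bulk(
    es,
    (
        {
            "_index": "cloudwatch-logs",
            "_source": {
                "@timestamp": e["timestamp"],
                "message": e["message"],
                "stream": e["logStreamName"],
            },
        }
        for e in events
    ),
)
```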

Some key things to consider are the learning curve for each piece. It’s not hard to get started, but it can get complex when you want to start doing data manipulation or index optimizations. The AWS hosted Elasticsearch service does not give you full control over all the available settings, but it is easier to manage.

Setting up your own cluster on EC2 instances will give you more control over optimizations and data manipulation but comes with more management overhead. Setting up high availability for Elasticsearch requires a minimum of three nodes, which can lead to higher costs. For reference, I manage a small cluster for my company that has two EC2 instances running Logstash, load balancing to three EC2 instances running Elasticsearch, and one running Kibana.

Before you start, you will need to estimate how much data will be moving from CloudWatch to Elasticsearch. If it’s less than tens of GBs per day, you can use smaller instances with a minimum of 2 GB (preferably 4 GB) of memory for Elasticsearch. Larger data sets will require instances with more memory, which adds to the cost. This will give you an idea of the budget you will need.
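If you want a quick number to base that estimate on, CloudWatch Logs publishes an IncomingBytes metric per log group. A small boto3 sketch like the one below (the log group name is a placeholder) sums it per day:

```python
from datetime import datetime, timedelta, timezone

import boto3

# Rough estimate of how many bytes a log group ingests per day,
# using the IncomingBytes metric that CloudWatch Logs publishes.
cw = boto3.client("cloudwatch", region_name="us-east-1")

resp = cw.get_metric_statistics(
    Namespace="AWS/Logs",
    MetricName="IncomingBytes",
    Dimensions=[{"Name": "LogGroupName", "Value": "/aws/lambda/my-app"}],
    StartTime=datetime.now(timezone.utc) - timedelta(days=7),
    EndTime=datetime.now(timezone.utc),
    Period=86400,          # one datapoint per day
    Statistics=["Sum"],
)

for point in sorted(resp["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"].date(), round(point["Sum"] / 1024 ** 3, 2), "GB")
```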

Also, as your data sets grow, the bottlenecks to look out for are memory utilization and storage/network bandwidth, which can be hard to diagnose in Amazon’s environment. But there are settings that are easy to modify to fit your use case of Elasticsearch, such as favoring resources for indexing rather than searching.
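As an example of that kind of tuning, two index settings commonly adjusted for heavy-ingest, light-search workloads are refresh_interval and number_of_replicas. Here’s a sketch with the Python client; the endpoint and index pattern are placeholders, so pick values that match your own cluster and retention needs:

```python
from elasticsearch import Elasticsearch

# Per-index tuning that trades search freshness and redundancy
# for faster, cheaper indexing. Endpoint and index pattern are
# placeholders.
es = Elasticsearch("http://my-es-node:9200")

es.indices.put_settings(
    index="cloudwatch-logs-*",
    body={
        "index": {
            "refresh_interval": "30s",   # default is 1s; slower refresh = less indexing overhead
            "number_of_replicas": 0,     # only safe if you can afford to re-ingest on node loss
        }
    },
)
```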

Keep in mind that there are a lot of guides and documentation for Elasticsearch, but you need to make sure they cover the version you’re using. There are sometimes pretty big changes even between minor versions, so reference the release notes just to be safe.

It might seem like a lot, but the setup is beginner-friendly, and once you see how powerful it is, it becomes very useful.

Also, Grafana can read data from Elasticsearch, but it’s not really for viewing logs; Kibana is better at that. Instead, I’ve used it for plotting time-series data, such as the number of 4xx errors from a web app log, or for tracking Elasticsearch-specific metrics like indexing times.
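For a sense of what that looks like under the hood, a Grafana panel like that is basically running a date_histogram aggregation against Elasticsearch. Here’s the equivalent query with the Python client; the index name, status field, and interval syntax are assumptions about how your web app logs are parsed and which ES version you run:

```python
from elasticsearch import Elasticsearch

# The sort of query a Grafana panel runs against Elasticsearch:
# count 4xx responses per minute. Index, field names, and endpoint
# are placeholders.
es = Elasticsearch("http://my-es-node:9200")

resp = es.search(
    index="weblogs-*",
    body={
        "size": 0,
        "query": {"range": {"status": {"gte": 400, "lt": 500}}},
        "aggs": {
            "per_minute": {
                "date_histogram": {"field": "@timestamp", "fixed_interval": "1m"}
            }
        },
    },
)

for bucket in resp["aggregations"]["per_minute"]["buckets"]:
    print(bucket["key_as_string"], bucket["doc_count"])
```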

Hopefully this helps out. I’m happy to answer any questions as well.