With the sudden increase in the number of tools and applications required to manage the IT environment, it is critical for any enterprise to analyze the logs generated by different systems and correlate them to track, prevent, and minimize infrastructure and application outages. Effective log management is fast becoming a necessity for IT compliance and security management.
Log management typically involves the following steps:
- Log and context data collection
- Normalization and categorization (parsing and pre-processing)
- Correlation (could require synchronization)
- Notification / Alerting
- Reporting / Dashboards (visualization)
- Backup and long term storage (archival)
- Purging of expired log data
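The first few steps above (collection, normalization, correlation, alerting) can be sketched in a few lines of Python. This is a minimal illustration only; the sample log lines, field layout, and correlation window are hypothetical assumptions, not a real product's format:

```python
from datetime import datetime, timedelta

# Hypothetical raw log lines; the "date time source message" layout is assumed.
RAW_LOGS = [
    "2024-01-15 10:02:11 firewall DENY src=10.0.0.5",
    "2024-01-15 10:02:12 firewall DENY src=10.0.0.5",
    "2024-01-15 10:02:13 webserver GET /login 401",
]

def normalize(line):
    """Parse a raw line into a structured event (normalization step)."""
    date, time, source, rest = line.split(" ", 3)
    return {
        "timestamp": datetime.fromisoformat(f"{date} {time}"),
        "source": source,
        "message": rest,
    }

def correlate(events, window=timedelta(seconds=5)):
    """Flag sources emitting repeated DENY events within a short window."""
    alerts = []
    denies = [e for e in events if "DENY" in e["message"]]
    for a, b in zip(denies, denies[1:]):
        if a["source"] == b["source"] and b["timestamp"] - a["timestamp"] <= window:
            alerts.append(f"repeated DENY from {b['source']}")
    return alerts

events = [normalize(line) for line in RAW_LOGS]
print(correlate(events))  # → ['repeated DENY from firewall']
```

In a real deployment these roles are played by dedicated tools (shippers, parsers, alerting engines), but the data flow is the same: raw lines become structured events, and rules run over the event stream.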
There are various kinds of logs that get generated. The following are some:
- Audit logs
- Transaction logs
- Intrusion logs
- Connection logs
- System performance records
- User activity logs
- Alerts from different systems
- Messages from other systems
These logs are generated by many devices (switches, routers, firewalls, etc.), systems software (operating systems, databases, web servers, etc.), and applications.
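Many of these devices and daemons emit syslog-formatted messages. As a concrete illustration, here is a minimal parser sketch for an RFC 3164-style line, where the leading priority value encodes facility and severity as `facility * 8 + severity`; the sample line itself is an assumption for demonstration:

```python
import re

# Minimal RFC 3164-style syslog pattern; real-world lines vary widely.
SYSLOG_RE = re.compile(
    r"<(?P<pri>\d+)>"                                  # priority = facility*8 + severity
    r"(?P<timestamp>\w{3}\s+\d+ \d{2}:\d{2}:\d{2}) "   # e.g. "Jan 15 10:02:11"
    r"(?P<host>\S+) "                                  # originating host
    r"(?P<msg>.*)"                                     # free-form message
)

def parse_syslog(line):
    """Split a syslog line into facility, severity, timestamp, host, message."""
    m = SYSLOG_RE.match(line)
    if not m:
        return None
    pri = int(m.group("pri"))
    return {
        "facility": pri // 8,
        "severity": pri % 8,
        "timestamp": m.group("timestamp"),
        "host": m.group("host"),
        "message": m.group("msg"),
    }

sample = "<34>Jan 15 10:02:11 router01 sshd[1234]: Failed password for root"
print(parse_syslog(sample)["severity"])  # → 2 (critical)
```

Shippers and parsers such as rsyslog or Logstash perform this kind of field extraction, at scale and with far more robust grammars.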
Below is a pictorial representation of the sequential steps in a log processing solution.
However, log management goes further, through archival and the eventual purging of log data that has passed its utility date.
There are myriad tools, both FOSS and COTS, available for each step of the log management life cycle. Some of them are listed below:
- Shippers: rsyslog, syslog-ng
- Parsers/Filters: Fluentd, Logstash, etc.
- Indexing and Search: Elasticsearch, Solr
- Visualizers: Kibana, Graylog
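To make the indexing-and-search step concrete, here is a toy inverted index in Python. It illustrates, in miniature, the core idea behind engines like Elasticsearch and Solr (tokens mapped to the documents containing them); it is not a real client API, and the sample messages are made up:

```python
from collections import defaultdict

class LogIndex:
    """Toy inverted index: token -> set of document ids."""

    def __init__(self):
        self.docs = []                    # doc id -> original message
        self.index = defaultdict(set)     # token -> {doc ids}

    def add(self, message):
        doc_id = len(self.docs)
        self.docs.append(message)
        for token in message.lower().split():
            self.index[token].add(doc_id)
        return doc_id

    def search(self, *terms):
        """Return messages containing all query terms (AND semantics)."""
        hits = set.intersection(*(self.index.get(t.lower(), set()) for t in terms))
        return [self.docs[i] for i in sorted(hits)]

idx = LogIndex()
idx.add("ERROR disk full on /var")
idx.add("INFO backup completed")
idx.add("ERROR backup failed on /var")
print(idx.search("error", "/var"))
# → ['ERROR disk full on /var', 'ERROR backup failed on /var']
```

Production search engines add analyzers, relevance scoring, sharding, and replication on top of this basic structure, which is why they are the tool of choice rather than hand-rolled indexes.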
There are other tools that encompass multiple functions for more end-to-end solutions. In the next article, we will discuss Log Operations Management through CoreStack with the ELK stack.