
Want to analyze big data? Check your log files

Log files offer a wealth of information about system and user functions. They also provide a safe way to get started analyzing large data sets.

Dave Rosenberg, co-founder of MuleSource

More than a few technology sectors seem to be turning up the volume on "big data" and the enormous challenges and opportunities that enterprises face in managing and analyzing their data and system resources.

There are a number of hip technologies and frameworks like Apache Hadoop, which is used to store, process, and analyze massive data sets, enabling applications to work with thousands of nodes and petabytes of data.
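
To make that concrete, here is a rough sketch (my illustration, not anything from the article) of how a log workload maps onto Hadoop: a small Python mapper and reducer that count HTTP status codes across an arbitrarily large pile of access logs, runnable with Hadoop Streaming. The field position assumes the common combined log format.

mapper.py:

#!/usr/bin/env python
# Emits "status<TAB>1" for each access-log line; Hadoop feeds lines on stdin.
import sys

for line in sys.stdin:
    fields = line.split()
    if len(fields) > 8:
        print("%s\t1" % fields[8])   # status code position in combined log format

reducer.py:

#!/usr/bin/env python
# Sums the per-status counts emitted by the mapper.
import sys

counts = {}
for line in sys.stdin:
    status, value = line.rstrip("\n").split("\t")
    counts[status] = counts.get(status, 0) + int(value)

for status in sorted(counts):
    print("%s\t%d" % (status, counts[status]))

The same pair also runs locally for a quick sanity check: cat access.log | ./mapper.py | sort | ./reducer.py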

Log management (image credit: LogLogic)

One area that provides never-ending data-analysis fodder is log files. For those not aware, log files are usually created and updated automatically whenever a machine or a user of a machine does something. Logs have often been put under the "dark matter" umbrella, signifying the challenge of mining useful information from raw data.

But operating system and application logs are a goldmine of vital information about the health and well-being of an organization's computer infrastructure. They can also record the day-to-day activity of system users and capture evidence of malicious activity.
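
Even without a big-data stack, a few lines of script show why logs are such fertile ground. The sketch below is a hypothetical example, with an assumed /var/log/syslog path and layout: it turns raw operating-system log lines into structured records and reports which processes are generating the most events.

# A minimal sketch: turn raw syslog lines into structured records so they
# can be counted and queried. Path and regex assume a typical Linux syslog.
import re
from collections import Counter

LINE = re.compile(r"^(?P<ts>\w{3}\s+\d+\s[\d:]+)\s(?P<host>\S+)\s(?P<proc>[\w\-/\.]+)(?:\[\d+\])?:\s(?P<msg>.*)$")

events_per_process = Counter()
with open("/var/log/syslog") as f:
    for line in f:
        m = LINE.match(line)
        if m:
            events_per_process[m.group("proc")] += 1

# Which processes are the noisiest? Often the first question in log triage.
for proc, count in events_per_process.most_common(10):
    print(proc, count)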

I spoke with Dimitri McKay, security architect for LogLogic, and asked him to provide a few examples of real-world use cases demonstrating how logs are used for business analytics and intelligence purposes in the enterprise.

Example 1--A global retail company uses log analysis to meet the requirements of the PCI DSS (Payment Card Industry Data Security Standard). Comprehensive reporting and secure long-term storage are critical requirements, and to support forensic analysis, all log data must be not only retained but also encrypted.
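
As an illustration of the "retained but also encrypted" requirement, here is a minimal sketch in Python using the third-party cryptography package. The file names and key handling are placeholders; this is not LogLogic's implementation or a production PCI design.

# Encrypt a log file before archiving it (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice, load the key from a key store
cipher = Fernet(key)

with open("access.log", "rb") as f:
    ciphertext = cipher.encrypt(f.read())

with open("access.log.enc", "wb") as f:
    f.write(ciphertext)

# Later, for forensic analysis, the archive can be recovered with the same
# key: cipher.decrypt(ciphertext) returns the original bytes.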

Example 2--A customer in the telecom industry was overwhelmed by the sheer amount of log data it was forced to consume for both forensics and operations. Its log data storage was doubling every nine months, and the home-grown solution it had been using was simply not keeping up.

This company wanted the ability to track a session from start to finish across its entire infrastructure, for both forensics and operations. By seeing where sessions were failing, it could reduce downtime and improve the user experience.
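
Here is a hypothetical sketch of that kind of session tracking: given log records from several services that share a session ID, group them per session and report where each failed session broke down. The CSV layout (timestamp, service, session_id, status) is an assumed normalization, not the customer's actual format.

# Correlate log records by session ID and report the first point of failure.
import csv
from collections import defaultdict

sessions = defaultdict(list)
with open("combined_logs.csv") as f:
    for row in csv.DictReader(f):        # columns: timestamp, service, session_id, status
        sessions[row["session_id"]].append(row)

for session_id, events in sessions.items():
    events.sort(key=lambda e: e["timestamp"])   # assumes ISO timestamps, which sort as strings
    if any(e["status"] == "error" for e in events):
        first_error = next(e for e in events if e["status"] == "error")
        print("session %s failed in %s at %s" % (session_id, first_error["service"], first_error["timestamp"]))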

Example 3--A global financial firm analyzes its log data to improve network performance, detect intrusions, and scan its infrastructure for vulnerabilities.
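
For the intrusion-detection side, even a simple scan of authentication logs can surface suspicious patterns. The sketch below flags source IPs with an unusually high number of failed SSH logins; the path, message format, and threshold are assumptions rather than the firm's actual setup.

# Flag possible brute-force attempts by counting failed SSH logins per source IP.
import re
from collections import Counter

FAILED = re.compile(r"Failed password for (?:invalid user )?\S+ from (\d+\.\d+\.\d+\.\d+)")
THRESHOLD = 50

failures = Counter()
with open("/var/log/auth.log") as f:
    for line in f:
        m = FAILED.search(line)
        if m:
            failures[m.group(1)] += 1

for ip, count in failures.items():
    if count >= THRESHOLD:
        print("possible brute-force attempt from %s (%d failures)" % (ip, count))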

Keep an eye on the buzz meter to see how vendors address the impending data explosion by providing solutions that help enterprises take advantage of these massive data sets.

Follow me on Twitter @daveofdoom.