The convergence of SIEM and log management

Though Security Information and Event Management (SIEM) and log management tools have been complementary for years, the technologies are expected to merge. Here's a look at what you can expect in second-generation log management and SIEM solutions.

SIEM emerged as companies found themselves spending a lot of money on intrusion detection/prevention systems (IDS/IPS). These systems were helpful in detecting external attacks, but, because they relied on signature-based detection engines, they also generated a large number of false positives.

The first-generation SIEM technology was designed to improve this signal-to-noise ratio and help surface the most critical external threats. Using rule-based correlation, SIEM helped IT detect real attacks by focusing on the subset of firewall and IDS/IPS events that violated policy. Traditionally, SIEM solutions have been expensive and time-intensive to maintain and tweak, but they solve the big headache of sorting through excessive false alerts and effectively protect companies from external threats.
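Rule-based correlation of this kind can be illustrated with a minimal sketch. The event fields (`type`, `source_ip`, `time`) and the specific rule (N IDS alerts from one source within a time window) are hypothetical simplifications, not any vendor's actual rule language:

```python
from collections import defaultdict

def correlate(events, threshold=3, window=60):
    """Flag source IPs that trigger `threshold` or more IDS alerts
    within a `window`-second interval -- a classic correlation rule
    for separating sustained attacks from one-off false positives.
    Assumes `events` is sorted by time."""
    by_source = defaultdict(list)
    flagged = set()
    for event in events:
        if event["type"] != "ids_alert":
            continue  # this rule considers only IDS/IPS events
        times = by_source[event["source_ip"]]
        times.append(event["time"])
        # keep only timestamps inside the sliding window
        by_source[event["source_ip"]] = [
            t for t in times if event["time"] - t <= window
        ]
        if len(by_source[event["source_ip"]]) >= threshold:
            flagged.add(event["source_ip"])
    return flagged

events = [
    {"type": "ids_alert", "source_ip": "10.0.0.5", "time": t}
    for t in (0, 10, 20)
] + [{"type": "firewall_allow", "source_ip": "10.0.0.9", "time": 5}]
print(correlate(events))  # the repeated-alert source is flagged
```

A single isolated alert never reaches the threshold, which is exactly how such rules suppress the false positives described above.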

While that was a step in the right direction, the world got more complicated when new regulations such as the Sarbanes-Oxley Act and the Payment Card Industry Data Security Standard mandated stricter internal IT controls and assessment. To satisfy these requirements, organizations were required to collect, analyze, report on and archive all logs in order to monitor activities inside their IT infrastructures.

The idea is not only to detect external threats, but also to provide periodic reports of user activities and create forensics reports surrounding a given incident. Though SIEM technologies collect logs, they process only the subset related to security breaches. They weren't designed to handle the sheer volume of log data generated by all IT components, such as applications, switches, routers, databases, firewalls, operating systems, IDS/IPS and Web proxies.
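The core of collecting logs from all of these components is normalizing heterogeneous formats into one common record that can be searched, reported on and archived uniformly. A minimal sketch follows; the raw line formats and field names are invented for illustration, and real collectors support far more device types:

```python
import re

# Hypothetical raw formats for two log sources.
PATTERNS = {
    # firewall: "DENY src=10.0.0.5 dst=192.168.1.1"
    "firewall": re.compile(
        r"(?P<action>ALLOW|DENY) src=(?P<src>\S+) dst=(?P<dst>\S+)"
    ),
    # web proxy: "GET http://example.com 200 client=10.0.0.5"
    "proxy": re.compile(
        r"(?P<method>GET|POST) (?P<url>\S+) (?P<status>\d+) client=(?P<src>\S+)"
    ),
}

def normalize(source, line):
    """Map a raw log line to a common record so events from firewalls,
    proxies, databases, etc. can be queried with one schema."""
    match = PATTERNS[source].match(line)
    if not match:
        return None  # unparseable lines would be archived raw
    record = {"source": source}
    record.update(match.groupdict())
    return record

print(normalize("firewall", "DENY src=10.0.0.5 dst=192.168.1.1"))
print(normalize("proxy", "GET http://example.com 200 client=10.0.0.5"))
```

Once every source feeds the same schema, both the periodic user-activity reports and the per-incident forensics described above become queries over one store rather than per-device parsing jobs.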

With an emphasis on monitoring user activities rather than external threats, log management entered the market with an architecture built to handle much larger volumes of data and to scale to meet the demands of the largest enterprises.