When the 'solution' is worse than the problem

03.05.2006

Without an understanding of vulnerabilities -- or of any error condition or bug, for that matter -- actual causes go undiscovered and unaddressed, administrators and developers don't learn from their mistakes, and quality inevitably declines.

Everyone makes mistakes, and a healthier organization might even have some fun with them: giving a "fat finger" award for the most interesting configuration error, or taking a developer out for a plate of spaghetti when a convoluted or difficult bug is finally traced back to its source -- served con funghi for missed specification changes, alla puttanesca for ignoring the spec altogether.

My son understands the meaning when I serve him an extra slab of cheese for a lame excuse -- it's all part of teaching him that it's better to admit and learn than to hide and lie. A tongue-in-cheek acknowledgement lets him know that while he may get a ribbing, he shouldn't fear repercussions for admitting the truth.

But burying the findings of a security assessment -- or worse yet, breeding an organizational culture of fear around the discovery and reporting of security vulnerabilities -- has a much more insidious long-term effect than merely reducing quality.

Demoralization and dishonesty aside, it's downright dangerous, because it creates the urge to modify audit findings and reports. And as we've seen, reacting poorly to a small problem is an excellent way to create a large one that'll reach far beyond the IT department. Keep in mind that in the current regulatory environment, many IT security controls are closely connected to, and in some instances covered by, rules and regulations such as Sarbanes-Oxley or HIPAA.

For example, a public financial company's firewall configurations are now considered financial controls, akin to any accounting process or application that keeps the right funds in the right place.