When the 'solution' is worse than the problem

03.05.2006
If my school-age son lies about a mistake he's made, it reflects poorly on me. After all, it's my job to teach him to own up to his actions -- to be more careful with fragile objects, not to leave all his homework to the last day of vacation -- and to make it right, if not now, then the next time.

It's a shame, then, that a good understanding of that same principle is so rare among computer operations and development managers.

The scene replays all over the corporate world: it's time to wrap up the findings from a network or application security test, and the assessor -- an in-house tester or perhaps an independent consultant -- reports that there are unpatched security vulnerabilities. After an uncomfortable silence in the conference room, someone from the team quietly says that that's terrible, just terrible. Is the in-house team unnerved by the risk of losing data? By getting 0wn3d?

No, it's awful news because the team will lose their bonus or perhaps even be penalized if the group hired to detect bugs or vulnerabilities actually finds some.

Inevitably there's a request to change the report -- either redefining terms so the risk hides under lower severity ratings, or triaging the vulnerabilities so that none of them qualifies as an imminent risk.

Sometimes the response toward an outside security consultant is even more hostile: clients will claim that the findings are bogus, that the assessor is incompetent, or that the methods were unusual and fundamentally incompatible with the organization's approach to risk management.