A software security chat with David LeBlanc

September 16, 2005
By Roger A. Grimes

Last week I summarized the difficulties in preventing buffer overflows in complex software and introduced fuzzers. With multiple buffer overflows announced every week in some of the world's most popular products, I want to continue the discussion on the practical realities of vendors trying to prevent them. If the biggest software companies, with their vast resources and talent, have a hard time preventing buffer overflows, can any complex software product be built without security flaws?

Again, I turned to buffer overflow expert David LeBlanc for a deep discussion of the issue at hand. David was a chief bug finder for Microsoft for six years and is now working at Webroot as the chief software architect. Here's what David had to say about the difficulty of flawless code-writing:

"Software bugs are measured in total number of defects per 1,000 lines of code [kloc]. Think of an application as having two measurements -- the first is total number of defects per kloc, and the second is the percent of all defects that have security implications. A typical programmer with typical development practices will have around 50 defects/kloc. A really good developer, aided by peer code review, can drive the defect rate down to five defects/kloc. Also, note that if we learn to recognize a new type of defect, the count could go back up.

"That"s still 5,000 defects per million lines of code, and a complex application can run up to around 3 million to 5 million lines. Operating systems and associated helper apps and services can run up into around 50 million lines. A typical utility app will run around 250 kloc.

"Code review isn"t perfect because we"re relying on people. They"ll be less effective before the coffee kicks in, less effective right after lunch, and it will vary from person to person. People also have problems maintaining state, and will tend to notice bugs that are local, and won"t often find bugs where the mistake was three function calls ago and the consequences are local. So code review is great, but not perfect.

"Next, we have automated tools -- these are getting better all the time. These are great; they"re consistent, but there are limits to what they can be taught to find.

"Both of the previous measures will tend to reduce defects/kloc without regard to exploitability. A good development team will eliminate defects without bothering to see if they are exploitable. There were several instances where Office 2003 was not vulnerable to a problem reported in a previous version because someone cleaned up some code without realizing they were actually fixing an exploitable condition. Many other projects, notably OpenBSD, advocate the same approach. Theo de Raadt has long advocated fixing bugs without regard to exploitability.

"Next, we start getting into mitigations. For example, if we have stack protection enabled, there will be some defects that were previously exploitable that are now not exploitable, others where it may be harder or that will force new techniques to be developed, and still yet others where the mitigation has no effect. The rule here is that for any given mitigation, there will be some defects that become completely non-exploitable, and for a sufficiently complex set of attack conditions, any given mitigation can be circumvented. An important thing to note here is that these often overlap, so multiple mitigations increase effectiveness quite a bit.

"We also have trade-offs -- for example, we could write everything in language that is built to prevent buffer overflows, like Visual Basic, where all the buffers are checked. No more buffer overruns. But performance will tend to suffer, as it would if you programmed completely in C# or Java. There"s an inevitable CPU and memory hit for that additional checking.

"Plus, while buffer overruns may be a significant problem, they"re far from the only type of security problem. Security is always a trade-off. I might go through the code replacing strcpy with strncpy. I"ll manage to remove some of the overruns, but now the weak link will be my calculation of the sizes involved.

"To answer the naysayers of negativism, I"d like to quote Sen. Sam Ervin of the Watergate hearings: "Any jackass can knock down a barn, but it takes a carpenter to build one." Writing secure software, especially on a large scale, is a much harder problem than finding individual bugs."

There are probably many people who would like to argue with David's comments for one reason or another, but I wanted readers to hear the voice of someone who was in the trenches and who has fought, and continues to fight, the good fight.

Attention, readers: Next week's column will deal with poorly coded software applications that require administrative or root access to function. Send me (roger_grimes@infoworld.com) your software product candidates for my "Wall of Shame."