How did I miss that?

09.01.2006
Success in IT, as in any field, is all about focus. But in this month's Harvard Business Review, Max H. Bazerman and Dolly Chugh posit that focusing too tightly can cause you to miss critical information that's right under your nose. From the Challenger disaster to the Vioxx debacle, bad decisions often can be traced back to a failure to consider information that was readily available. Bazerman, a professor of business administration at Harvard Business School, talked with Kathleen Melymuka about how to take the blinders off.

What is "bounded awareness"? It's the tendency to fail to see critical information in our environment because we're overly focused on some subsegment of what's out there. We're so focused on a specific task that we miss other information that's extremely relevant.

In your article, you give an example from the Ulric Neisser study. He superimposed two videos, one on top of the other. One has three players in white T-shirts passing a ball. The other has three players in dark T-shirts passing a ball. The players never pass the ball between the colors. People watching are given the task of counting the number of passes among the white T-shirts. As the film goes on, a clearly visible woman with an open umbrella walks through the frame. She is so visible that normally everyone in the room would see her, but when they're busy counting, the vast majority of people don't. In Neisser's study, only 21 percent saw her. My experience with executives is closer to 3 percent. Neisser was looking at what people fail to see literally, but we're looking at what people fail to see figuratively.

You write about several causes of bounded awareness. The first is the failure of decision-makers to seek information. On the face of it, that sounds silly. It does, but there are situations where people use only the information in the room when they should be identifying the information they need to make the decision optimally. One classic example is the Challenger disaster, where decision-makers at NASA didn't ask for the relevant information to analyze whether low temperature was related to O-ring failure. They used only the information that was available.

How can a CIO avoid that kind of error? See Neisser's video, or go to http://viscog.beckman.uiuc.edu/djs_lab/demos.html, where my colleague Dan Simons has 12 of these visual illusions. I find it's very valuable for executives to see a visual illusion. It tells us that there are things going on in our minds we really don't understand. If I were to tell a CIO, "There's important information out there, and you're missing it," he would probably say, "And your evidence would be what?" It's useful to unfreeze people with the visual illusion. Then they can more readily see that there are situations where smart people miss opportunities to bring the right information to a decision.

You write that another cause of bounded awareness is failure to use information. Can you give me an example? In many companies, the information is there, but somehow it doesn't get used. In the Vioxx story, it's clear that information about the medical risks existed at Merck long before the public became aware of them. It seems they had the information but didn't use it.