Out With the Old, in With the New

19.01.2009
Late last year, just when it seemed that every slimy rock on Wall Street had already been turned over, came news of the Mother of All Ponzi Schemes -- the apparent disappearance of $50 billion at the hands of Bernard Madoff.

The mind boggles at such sums. And yet, that's small potatoes compared to the pyramid scheme that was built around mortgage-backed securities. I used to wonder at the megacompensation of those "managers" on Wall Street. How could so many make so much doing things that seemed to contribute so little to society?

I chalked it up to my ignorance, to my having gone to business school long before things like credit-default swaps existed. If someone got a $10 million bonus for doing something, that something must be pretty important and useful, I figured.

A mortgage is a thing of value -- to the home buyer, to the lending institution and to society. But mortgages were packaged and sold, and then repackaged and resold again and again until they were buried in deals so complex that neither their buyers nor their sellers completely understood them.

But so what? Risk and "leverage" (debt) were in fashion, and every party at each step made big bucks. Never mind that no real value was created at most of those steps.

So, what does this have to do with IT? Last year, researchers at the Wellcome Trust Centre for Neuroimaging at University College London pinpointed a part of the brain, called the ventral striatum, that is the locus of people's craving for the new and unfamiliar. It predisposes people to take risks even when there is little logical basis for doing so. I think it was at work on Wall Street, and I think IT managers are often driven by it as well.

The history of IT since the 1970s can be summed up as one giant quest to find a better place to put stuff -- hardware, software and data. First it was on mainframes, then on clients and servers, then in N-tier arrangements; then it was outsourced or offshored, then "virtualized," then put on external Web servers, then moved into the cloud. There were even a few flashbacks: Mainframes were said to be back in style; thin clients were in, then out, then back again. Every year or so, it seems, someone comes up with a "better" idea of how to slice and dice computing's resources.

Here's how it works: An IT manager has a certain computing infrastructure in place for his company. It works OK, maybe better than the last thing he had, but of course it does have some problems. Onto the stage stride the vendors, the analyst blowhards, a few peers and maybe a few users, all saying they have a better idea. Indeed, the IT manager needs a better idea in order to deflect criticism for that downtime last week and to justify a budget increase. Plus, his ventral striatum says new things and risk-taking are good.

Our hapless IT manager finds that changing to the new idea is much more expensive and painful than anyone expected and that not all the promised benefits are realized. But at least he has a new "platform" on which to catch his breath until the next big thing comes along.

Am I suggesting that we IT people -- who live on the leading edge almost by definition -- should run away from the next big thing? Should we emulate our financial brethren, so recently drunk on risk and leverage, who are now afraid to get out of bed? No, I'm just suggesting that new computing paradigms always have a certain faddish quality, and they entail risks you don't always have to take. Caution is back in style.

So this would be a good year to make what you have work a little bit better -- a good year to invest in training, procedures, documentation and other boring things. Then, by 2010, you'll be ready for the next silver bullet.

Gary Anthes is a Computerworld national correspondent. You can contact him at gary_anthes@computerworld.com.