The virtualization myth

April 25, 2006
There are only a few markets ideally suited for virtualization. One of them is software development. As the scene is usually painted, the developer sits at his or her desk, compiles new software, and launches it in a virtual machine so that when it crashes, it doesn't take the whole box down.

We hear of developers who keep a Linux instance open on Windows, or vice-versa, either for the sake of cross-platform productivity or to strike a blow for religious freedom.

But as long as virtualization is viewed as a tool of convenience for individual developers, IT is not likely to get very excited about it. The larger reality is that once we can assume virtualization is a standard component of every OS, major changes with bottom-line impact will take place miles away from any one developer's desk.

As a former developer, I do get excited about it. Now that I know virtualization can do what I always wished I could, I can't imagine architecting, developing, or doing QA on SOA and other large-scale distributed solutions without virtualization as a core OS component on every machine my solution might touch.

When a development lead hands a project to QA, technical writers, tech support, or anyone else up or down the development chain, the project should always be passed along as a virtual disk image that's ready to roll.

That means the virtual disk image would carry the OS with all of the application's dependencies already in place, the application itself, and the sample data and scripts required to test it thoroughly.
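
To make that concrete, here is a minimal sketch, in Python, of the kind of staging step such a hand-off might begin with. The paths, the manifest fields, and the script itself are hypothetical illustrations, not any particular vendor's tooling; the point is simply that the application, its test data, and its test scripts all ride along in one image.

    #!/usr/bin/env python3
    # Hypothetical sketch: stage everything a hand-off image should carry.
    # All paths and names below are placeholders, not a real tool's layout.
    import json
    import shutil
    from pathlib import Path

    STAGING = Path("image-staging")   # tree later packed onto the virtual disk

    # Components that travel with the preconfigured OS (directories, illustrative):
    COMPONENTS = {
        "application": "build/myapp",      # the application itself
        "sample_data": "qa/sample-data",   # data needed to exercise it
        "test_scripts": "qa/tests",        # scripts for a thorough test pass
    }

    def stage() -> None:
        """Copy each component into the staging tree and write a manifest."""
        STAGING.mkdir(exist_ok=True)
        for name, source in COMPONENTS.items():
            src = Path(source)
            if not src.exists():
                raise FileNotFoundError(f"missing component: {source}")
            shutil.copytree(src, STAGING / name, dirs_exist_ok=True)
        manifest = {"base_os": "preconfigured-dev-image", **COMPONENTS}
        (STAGING / "manifest.json").write_text(json.dumps(manifest, indent=2))

    if __name__ == "__main__":
        stage()
        # A follow-on step (not shown) would pack image-staging/ into the
        # virtual disk image that gets handed down the development chain.

The manifest is there so whoever receives the image can see at a glance what was bundled and against which base OS it was configured, instead of reconstructing that from e-mail threads.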