Lack of tests could block virtualization

03.12.2004
By Patrick Thibodeau

Server virtualization technology is expected to play a big role in increasing CPU utilization rates on x86-based servers in the next few years. But attendees at Gartner Inc.'s data center conference in Las Vegas this week said one potential roadblock is the need to test packaged applications on virtualized systems. That issue could put the relationships between users and software vendors to the test if vendors are reluctant to troubleshoot their applications on servers running virtualization software, according to Gartner analysts and IT managers at the conference.

Tony Fernandes, vice president of technology infrastructure at Inventure Solutions Inc., the internal IT arm of Vancouver City Savings Credit Union in British Columbia, plans to begin testing Microsoft Corp.'s Virtual Server software next year.

Fernandes said he expects to have to train his staff to perform some application troubleshooting tasks, and he views the testing process as an opportunity to find out if his vendors' use of the word partner rings hollow. "Partner is this great word, but how many spell it correctly?" Fernandes said. He added that he plans to give the application vendors he works with this message: "You say that I'm an important customer, so show it."

Fernandes and other users said they have two strategies for dealing with vendor resistance to testing. One approach involves training internal IT staffers to do the necessary work on virtualized servers. The other might be called the blunt-force method: threatening to take their business elsewhere. Fernandes said he thinks that in 95 percent of the cases at his company, he could find an alternative application vendor if necessary.

Conference attendees said the software most likely to need troubleshooting on virtualized servers will come from point-solution vendors that develop specialized applications, often for specific vertical industries.

Many of those vendors are relatively small and don't have the funding or expertise to test their software in virtualized environments, said William Miller, manager of computing services at Roche Diagnostics Corp., an Indianapolis-based maker of medical diagnostics equipment.

Miller plans to conduct his own tests of third-party software on virtualized servers and then seek help from vendors if there are problems. He said he will deliver a message similar to the one that Fernandes has in mind: "Support me, or I'll go find another point solution."

Application support by third-party software vendors is "the main issue" in adopting virtualization technology, said Luis Franco, vice president of technology at Banesco Bank in Venezuela. Some vendors "don't want to assure the quality of their applications" on virtualized servers, Franco said, adding that the bank needs to increase the skills of its own personnel as a result.

In the long run, though, application vendors may have little choice other than to make the adoption of server virtualization software as easy as possible for users.

Gartner analyst Tom Bittman predicted that the average CPU utilization rate on two-way Wintel servers will increase to 40 percent by 2008, up from about 25 percent now. The rise will be partly driven by an increase in virtualization offerings, including Microsoft's Virtual Server, he said.

Some users believe that x86-based servers are so inexpensive, there's no point in buying virtualization software for them, Bittman said. But he argued that users may be spending more on x86-based servers as a whole than they do on mainframes or Unix systems.

The low-end servers also generate a lot of heat because of their increasing CPU power and density, contributing to cooling problems in many data centers, Bittman said.

Dave Mahaffey, technical systems administrator at the Santa Clara Valley Water District in San Jose, said he was at the conference to talk to vendors and research virtualization issues. "We've got a server for every application, and it's getting out of hand," Mahaffey said. He added that he wants to consolidate servers and increase their CPU utilization rates from the current level of between 15 percent and 20 percent to as much as 50 percent.