Allchin dispels myths on 64-bit computing

April 18, 2005
By Carol Sliwa

Jim Allchin, group vice president of platforms at Microsoft Corp., wants to dispel the notion that 64-bit computing is helpful only for big database servers and computer-aided design (CAD) applications. Allchin last week extolled the benefits of 64-bit computing during an interview with Computerworld. Microsoft is expected to release 64-bit editions of Windows XP and Windows Server 2003 this month. Part 1 of the interview is available online. Excerpts from Part 2 follow:

What new capabilities will users gain with 64-bit computing? The 64-bit world is very significant in my view for a number of reasons -- most of which people don't understand. ... First, x64 supports 128GB of RAM and 16TB of virtual address space. What this means is you could actually apply a significant amount of memory to one of these machines. And you could keep everything that you're dealing with in memory. It changes the whole paradigm.
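
To put those numbers in rough perspective, here is a minimal C sketch of the arithmetic; the 16TB figure is the x64 Windows virtual address space Allchin cites, and the rest is just powers of two:

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* On a 32-bit build, pointers are 4 bytes and the whole virtual
           address space is 2^32 bytes = 4GB, split between app and OS.
           On an x64 build, pointers are 8 bytes, and Windows x64 exposes
           the 16TB of virtual address space mentioned above. */
        printf("pointer size: %u bytes\n", (unsigned)sizeof(void *));

        uint64_t va_32bit = 1ULL << 32;   /* 4GB  */
        uint64_t va_x64   = 16ULL << 40;  /* 16TB */

        printf("32-bit virtual address space: %llu GB\n",
               (unsigned long long)(va_32bit >> 30));
        printf("x64 virtual address space:    %llu TB\n",
               (unsigned long long)(va_x64 >> 40));
        return 0;
    }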

(With) videos, if you looked at how much actual data is on the machine, it probably isn't that much. If you could put it all in memory, things that you might have thought would be hard before will become easy, because you can search and you can tie pieces of information together in such a simple way, because you can just use brute-force approaches. Primary memory is thousands of times faster than accessing the disk. So if you keep it in memory, you can get that kind of performance improvement when you're trying to search, particularly when you're doing random-type searches. ... So I think you will see more and more primary memory being added to the systems. ... You have interesting things that you can do with that memory, (but) the applications haven't evolved yet because they haven't seen that opportunity.
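
The "brute force in memory" idea can be sketched in a few lines of C; the buffer size and the planted search term below are purely illustrative, but the point is that once the data fits in RAM, a plain linear scan replaces any clever on-disk index:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    #define BUF_SIZE (256u * 1024u * 1024u)   /* 256MB kept entirely in RAM */

    int main(void)
    {
        char *buf = malloc(BUF_SIZE);
        if (!buf) return 1;
        memset(buf, 'a', BUF_SIZE);
        memcpy(buf + BUF_SIZE / 2, "needle", 6);   /* plant something to find */

        /* Brute-force scan: touch every byte in memory instead of seeking on disk. */
        size_t hits = 0;
        for (size_t i = 0; i + 6 <= BUF_SIZE; i++)
            if (memcmp(buf + i, "needle", 6) == 0)
                hits++;

        printf("found %u match(es) scanning %u MB in memory\n",
               (unsigned)hits, BUF_SIZE >> 20);
        free(buf);
        return 0;
    }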

Point 2 is that the transition will be much easier ... because the 32-bit world is flat memory, the 64-bit is flat memory, and it's fairly easy to just recompile and make some minor changes and you can have a 64-bit app, for most apps.
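
One common example of the kind of "minor change" he is describing, sketched here as a general illustration rather than anything specific to Microsoft's codebase: 32-bit code often stashes pointers in 32-bit integers, which truncates them on x64 (where int and long both stay 32 bits wide on Windows), and the fix is a pointer-sized integer type:

    #include <stdint.h>
    #include <stdio.h>

    /* A habit that is harmless in 32-bit Windows code but truncates
       pointers when recompiled for x64: storing a pointer in a 32-bit
       integer. */
    static void print_cookie(void *p)
    {
        /* The old pattern would lose the upper half of the pointer:
           unsigned long cookie = (unsigned long)p;                  */

        /* The 64-bit-clean fix: a pointer-sized integer type. */
        uintptr_t cookie = (uintptr_t)p;
        printf("cookie = %p\n", (void *)cookie);
    }

    int main(void)
    {
        int x = 42;
        print_cookie(&x);
        return 0;
    }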

You also get benefits even if you don't need the memory. I think most people think that 64-bit is only if you have a big database or you have a workstation-type app or CAD app. But even here in town there happens to be a music recording company called Cakewalk, and they converted their multitrack recording system to run on x64 on top of the Intel EM64T machine, and they got a 40 percent improvement. That wasn't because they needed the memory. They got it because of the way registers are handled in the calling conventions and in the floating point. So you get benefits beyond just having the raw amount of memory by moving to this architecture.
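
The calling-convention gain he mentions can be illustrated with a trivial C function; the details below are general x64 architecture facts, not Cakewalk's code. Compiled for 32-bit x86, the arguments are pushed on the stack and the result travels through the x87 floating-point stack; compiled for x64 under the Windows convention, the first four floating-point arguments arrive in the XMM0-XMM3 registers, and the compiler also has twice as many general-purpose registers (16 versus 8) to work with, so it spills to memory far less often:

    #include <stdio.h>

    /* The body is trivial on purpose; the point is how a, b, c and d
       travel.  On 32-bit x86 they go on the stack; on x64 they arrive
       in XMM0-XMM3, and the return value comes back in XMM0. */
    double mix(double a, double b, double c, double d)
    {
        return (a * b) + (c * d);
    }

    int main(void)
    {
        printf("%f\n", mix(1.0, 2.0, 3.0, 4.0));
        return 0;
    }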

Another advantage that I see deals with security in that 64-bit has "no execute" on by default, and that means you have an additional level of security. Not perfection -- but an additional level of security for marking data segments as not being able to run code. So it means certain attacks to the stack aren't possible. We tried to do this a little bit with SP2 for the 32-bit world, but it doesn't work anywhere near as easily as in the 64-bit world.
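
What "no execute" means in practice can be sketched with a small Windows C program; this uses the documented VirtualAlloc API and MSVC structured exception handling, and it is a demonstration under those assumptions, not Microsoft's own test code. On a machine with hardware "no execute" (DEP) enforced, jumping into a page allocated as plain read/write data raises an access violation instead of running it:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* Allocate an ordinary data page -- readable and writable,
           but not marked executable. */
        unsigned char *page = VirtualAlloc(NULL, 4096,
                                           MEM_COMMIT | MEM_RESERVE,
                                           PAGE_READWRITE);
        if (!page) return 1;

        page[0] = 0xC3;                     /* x86 RET instruction */

        __try {
            ((void (*)(void))page)();       /* try to execute the data page */
            puts("executed a data page (no-execute not enforced)");
        } __except (EXCEPTION_EXECUTE_HANDLER) {
            puts("access violation: no-execute blocked the data page");
        }

        VirtualFree(page, 0, MEM_RELEASE);
        return 0;
    }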

What is it about 64-bit that facilitates that? Why was it more difficult with 32-bit? You have to be able to mark the page entries as having "no execute" from the physical hardware. The way the chip works in 32 bits, you had to go into what was called PAE (Physical Address Extension) mode. And PAE mode changes the way the page tables are managed so that you have that bit to be able to say it's "no execute." What that means is you had to boot the system in PAE mode, which is a mode that we had that goes back to, geez, maybe Windows 2000. I'm not sure when we put that in there. That was the only way you could get "no execute" support on the 32-bit machine, versus the 64-bit, where it was designed in from the beginning. All the page descriptors have a way to say "no execute" on it. The normal system can just mark everything ... without booting it in some special way. I probably didn't do a perfect job in explaining it.
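
The hardware detail behind that answer can be sketched as follows; the type names and the helper are illustrative only, not Windows definitions. A legacy 32-bit page-table entry is only 32 bits wide and has no spare bit for "no execute," while PAE and x64 entries are 64 bits wide and reserve bit 63 as the NX ("execute disable") flag:

    #include <stdint.h>
    #include <stdio.h>

    typedef uint32_t pte32_t;   /* legacy 32-bit PTE: no room for an NX bit */
    typedef uint64_t pte64_t;   /* PAE / x64 PTE: bit 63 is the NX bit      */

    #define PTE64_NX_BIT  (1ULL << 63)

    static int page_is_no_execute(pte64_t pte)
    {
        return (pte & PTE64_NX_BIT) != 0;
    }

    int main(void)
    {
        pte64_t data_page = 0x1234007ULL | PTE64_NX_BIT;  /* data: NX set   */
        pte64_t code_page = 0x1235005ULL;                 /* code: NX clear */

        printf("data page no-execute: %d\n", page_is_no_execute(data_page));
        printf("code page no-execute: %d\n", page_is_no_execute(code_page));
        return 0;
    }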

PAE mode? It was a mode I think we created so we could get 3GB in the app space and 1GB for the OS space. You only had 4GB to play with in the 32-bit world, versus 16TB in the 64-bit world. So, big difference there. We had 4GB to play with, and the typical way XP works today is 2GB for the OS, 2GB for the apps. This is just virtual address space, not physical. The only way you got "no execute" is if you gave 3GB to the app and 1GB to the OS, virtual address space. So it required you to go into a separate special mode in order to get it. But it's native in the 64-bit.

Another security aspect in x64 is we have a way to prevent malicious changing of OS code. If somebody takes a driver and installs it in the system, you would think they could just change the page table, because they're running as the core of the operating system. They could just go change the page table and then they can write their own instructions where Microsoft instructions were. The way we've changed the x64, they won't be able to do that. We've made it much more difficult.

Another aspect that I think is consequential is 32-bit apps running on a 64-bit OS. They just run. And they typically run faster. And so the compatibility level is very, very high.

Another thing that I think will help the transition here is the fact of price. We're not charging anything more. The processor vendors are not going to charge anything more. It just makes it an easy transition. ... At the end of this year, I think it will be very difficult to buy a processor that's not 64-bit for servers. Probably 40 percent to 50 percent of the client processors coming from the manufacturers will be 64-bit. By the end of next year, it will sweep through the mobile and sweep through many of the others. So there may be some residual left as we go out to the end of next year, but it's going to happen pretty quickly.

Will 32-bit applications get a performance boost running on 64-bit Windows? We've done a bunch of tests. Most of them show a performance gain. You're not going to see some huge performance drop. OK. That's the thing you should baseline. What you will see typically is a little bit of performance gain. And on the 64-bit ones, you'll see a bigger jump. And what's great is you can have 'em on the screen at the same time. We've really perfected this. If you remember the terminology, we used WOW -- Windows on Windows -- for the 16-bit running on 32-bit. We've gotten really good at how to do this. So 32-bit on 64-bit, we've gotten really good. The compatibility is very good.
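
A 32-bit program can see the WOW64 layer he is describing for itself; the sketch below uses the documented IsWow64Process API, resolved dynamically because very old versions of kernel32.dll do not export it:

    #include <windows.h>
    #include <stdio.h>

    typedef BOOL (WINAPI *IsWow64Process_t)(HANDLE, PBOOL);

    int main(void)
    {
        BOOL isWow64 = FALSE;

        /* Resolve IsWow64Process at run time so the program still loads
           on 32-bit systems whose kernel32.dll lacks the export. */
        IsWow64Process_t fn = (IsWow64Process_t)
            GetProcAddress(GetModuleHandle(TEXT("kernel32")), "IsWow64Process");

        if (fn && fn(GetCurrentProcess(), &isWow64) && isWow64)
            puts("32-bit process running on 64-bit Windows (WOW64)");
        else
            puts("native process (or WOW64 query unavailable)");

        return 0;
    }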

What sort of performance increase will users see for a 32-bit application running on 64-bit Windows? Perhaps 5 percent to 10 percent? Yeah. It's small. ... It dramatically depends on how much (the applications) call the OS. The more they call the OS, the more gain they'll get, because the OS is taking advantage of this.

Will there be separate 32-bit and 64-bit versions of Longhorn? We'll have both.

What"s the Windows release cycle going forward? We"re just listening and learning, if you will. We want to get back to where we"re doing major releases every three to four years and minor releases every one to two years. We did it with XP. We did about 18 months. And you could say we did it again with SP2, because that was a pretty consequential release. These are some of the things that we have done to get us in a position where we can hold parts of the system constant and move other parts ahead. For example, a minor release might not have any kernel improvements and we may just freeze that -- or maybe not. We can choose, because we"ve got a lot of what"s going on, architectural layering, componentization, that helps us. When I say quality focus, do it right the first time, I don"t mean necessarily through the customer. I mean internally that we"re trying to find the issues and correct them at the developer"s workstation first, if possible, maybe in the feature branch, before it"s integrated with the virtual build lab. We"re trying to catch the problems early before they end up making problems for others. We"re trying to keep it as isolated as possible.

Then we have a significant change in terms of our development tools and test automation. We moved a significant number of people from the research organization who were doing production-type tools for software development. They've been helping for the last 15 months or so to improve the processes and engineering that we've got for building the product. We also organized a team just on engineering excellence, and their goal under (Senior Vice President of Windows) Brian Valentine is to ensure that they are driving this type of thinking through everything that's going on in Windows and throughout the platforms group.