Doug's Mailbag: Windows 8 Going 128?, More
A rumor that Windows 8 will be 128-bit has been making the rounds recently. A couple of readers ponder when -- if ever -- we would need all that computing power:
I'm sure Microsoft is planning 128-bit to some degree, but it will be a LONG time before anyone would even consider that much address space. 64-bit gives you 18 exabytes. One EB is 1 billion GB, so 2^64 bytes works out to 17.2 billion GB after you divide by 1,024 each step! I personally don't know of any enterprise customers coming anywhere close to that kind of memory usage, so I can't imagine going to 128-bit already. 128-bit is 3.4e+38, which is 34 followed by 37 zeroes. Calculator had to resort to scientific notation because it couldn't fit the number in its display; heck, I don't even know what that number would be called. For reference, it goes kilobytes, megabytes, gigabytes, terabytes, petabytes, exabytes, zettabytes, yottabytes. One yottabyte is a 1 followed by 24 zeroes, so 128-bit gives some number far larger than a yottabyte. We would have to start measuring things with a larger standard, the way NASA uses light-years instead of miles or meters to measure the universe.
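The unit arithmetic in that letter is easy to double-check. Here's a quick Python sketch (mine, not the reader's) that reproduces the figures:

```python
# Address spaces discussed above, computed from first principles.
space_64 = 2 ** 64    # bytes addressable with 64-bit pointers
space_128 = 2 ** 128  # bytes addressable with 128-bit pointers

EB = 10 ** 18   # one decimal exabyte (1 billion GB)
GiB = 2 ** 30   # one binary gigabyte ("divide by 1,024 each step")

print(space_64 / EB)              # ~18.4 -- the "18 exabytes" figure
print(space_64 / GiB / 10 ** 9)   # ~17.2 -- the "17.2 billion GB" figure
print(f"{space_128:.1e}")         # ~3.4e+38 -- what Calculator showed
```

Note that the two exabyte figures differ slightly because one counts in powers of 10 and the other in powers of 1,024.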
Today, most 64-bit computers are using Microsoft's implementation of x64, but it is only the tip of the iceberg. Windows x64 can address 16 TB of virtual address space compared to the 18 EB of true 64-bit -- roughly one-millionth of the full range. Microsoft's implementation of x64 allows each process to address its own 8 TB of virtual address space. Once we start filling that up, I'm sure true 64-bit systems, or a new implementation of it, will be out allowing each process to address its own 8 EB (8 EB for the kernel and 8 EB for each process) of virtual address space. We don't even have hard drives that reach 1 PB, let alone exabytes, which is why I would be surprised if 128-bit were really being considered so soon.
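How small a slice of the 64-bit space that 16 TB limit really is can be computed directly -- a quick sketch (my illustration, using the 16 TB figure from the letter):

```python
# Windows x64's circa-2009 virtual address space vs. a full 64-bit space.
full_64 = 2 ** 64            # bytes in a flat 64-bit address space
windows_x64 = 16 * 2 ** 40   # 16 TB total (8 TB per process + 8 TB kernel)

fraction = windows_x64 / full_64   # exactly 2**-20, i.e. 1/1,048,576
print(f"{fraction:.8%}")           # ~0.00009537% of the full space
```

So "less than 1 percent" dramatically understates it: the gap is a factor of about a million.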
It's not clear to me what I would need with a 128-bit desktop machine, but one never knows for sure. For example, back in the 1970s and 1980s, IBM's System/34 and System/36 minicomputers had machine instructions that would add or subtract binary numbers up to 2,096 bits in length. Makes 128 bits seem downright puny by contrast.
The principal reason for a processor with a 128-bit instruction set is addressability to very large memory and data storage addresses (what the techies call "I/O addressability"). Another reason is the ability to expand the number of operation codes (e.g., load address, left shift, right shift, test under mask) a processor can support. Yet another is to expand the number of general registers an operating system can use for multi-tasking. And if one needs to do floating-point arithmetic on very large or very small numbers to a high degree of accuracy, a 128-bit instruction set makes that possible.
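To illustrate the floating-point point (my sketch, not the letter writer's): an IEEE 754 quadruple-precision (binary128) float carries a 113-bit significand, about 34 decimal digits, versus roughly 16 digits for today's 64-bit double. Python's decimal module can emulate that digit count even without 128-bit hardware:

```python
from decimal import Decimal, getcontext

# A 64-bit double carries only ~16 significant decimal digits...
double_third = 1 / 3
print(f"{double_third:.34f}")   # digits beyond ~16 are rounding noise

# ...while a 128-bit (binary128) float would carry ~34 digits.
getcontext().prec = 34          # emulate the quad-precision digit count
quad_third = Decimal(1) / Decimal(3)
print(quad_third)               # 34 significant digits, all correct
```

Software emulation like this works, but it is orders of magnitude slower than a hardware instruction would be -- which is exactly the reader's argument for wider instruction sets.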
However, a 128-bit processor may operate slower than a 64-bit or 32-bit processor if the application programs do not need these features. I could explain all of this in greater detail, but your eyes may glaze over after the first sentence. This is microcode and assembler language stuff.
Meanwhile, a few more of you just aren't buying it:
Come on, Doug -- get real. I've been reading you for lots of years (I was an Amiga zealot for WAY too long). But 128 bits? Have you heard ANY rumor whatsoever about any chip manufacturer releasing a 128-bit chip? And why? What is 128 bits going to give you?
Pshaw. You've been taken in by someone making a joke. Albeit one more appropriate for April 1, but a joke nonetheless.
So, has someone been duped? See "Windows 8 to Be 128-Bit? No. Good God, No."
So, will IT bounce back in 2010, like Forrester predicts? These responses give a mixed outlook:
Forrester is probably right. The SMBs that I have worked with have kept chugging forward with all of the IT projects they can afford. They want to be ready for the rebound and be at their best. I left a Fortune 500 in June because I found a job closer to home, but (more importantly) this company is forward-thinking. They are preparing for the future by pursuing projects to expand into vertical markets.
I have two clients I do programming for, one an SMB and the other a Fortune 100 subsidiary. The company I left was in cut-everything mode. The Fortune 100 client has stopped all contract work, regardless of significant long-term cash savings. Based on these examples, and since there are a lot more SMBs out there than Fortune 500 companies, I can see Forrester's report having significant merit.
Here at the University of California, there are pay cuts and layoffs this year in IT -- big layoffs, "temporary" pay cuts. The feeling is that both will continue in 2010, and maybe longer.
Check in on Friday for more reader letters, including Vista solutions and favorite mobile OSes. Meanwhile, share your thoughts by e-mailing Doug at [email protected].
Posted by Doug Barney on 10/14/2009 at 1:17 PM