Posey's Tips & Tricks

How Console Gaming Paved the Way for Desktop-as-a-Service

The earliest gaming consoles provided a tidy solution to the problem of PC hardware compatibility. Now, Brien argues, DaaS is doing the same thing.

When I was a kid growing up in the 1980s, I really enjoyed playing PC games with my friends. Even so, the concept of gathering at someone's house to play a game wasn't quite as simple as it might sound.

Back then, major hardware and software compatibility issues had yet to be addressed. Modern Windows OSes handle device drivers at the OS level, but in those days, device drivers existed at the application level.

This meant that each application vendor had to write device drivers that would allow the application to be played on various hardware configurations. Before you could use a piece of software, you would typically have to work through a short configuration process that involved selecting your sound card, graphics card and whatever other hardware components the software needed to use.

The big problem, of course, was that not every piece of software worked with every piece of hardware. Although one of my friends had a PC that was far superior to what any of the rest of us had, there were several occasions when we had to play a game at someone else's house because a particular game didn't work with his video card.

Sorting out these types of compatibility issues could be frustrating at times, but we usually made it work one way or another. One day, something happened that changed everything: One of my friends bought a Nintendo Entertainment System (the original 8-bit NES).

As my friend proceeded to demonstrate a game called Duck Hunt, two things immediately caught my attention. First, the Nintendo's graphics and sound were superior to those of most PC games of the time. Second, the games just worked. We didn't have to fool around with a setup program or wonder whether a game was going to be fully compatible. We just popped a game into the console and turned on the power. That was it. No frustration. No tinkering. Just gaming bliss.

Later on, of course, several of us purchased Nintendo consoles. We soon came to the realization that it didn't matter whose house we went to; every NES system played the games in the same way. Hardware compatibility simply was not a factor.

Nearly as impressive to me was the console's reliability. If a game didn't load correctly, you could always resort to the old trick of blowing on the cartridge (although I'm not sure that actually did anything). Regardless, the old Nintendo cartridges were far more durable than the disks of the time, and I never saw a Nintendo game come to a grinding halt because the device had run out of memory or because its CPU was overloaded.

Over time, I began thinking a lot about all of the ways in which console games were superior to PC games. As I did, I began to wonder if there might be a way to standardize PC hardware to ensure that PC software -- not just games -- would run just as reliably as a Nintendo game. I envisioned a system in which you could simply insert a disk and the software would run exactly the way that its authors intended.

Believe it or not, the PC industry did attempt to introduce the concept of a standardized platform at one point. No, I'm not talking about modern Docker containers, although containers do address many of the issues that I have talked about.

Instead, it was the Multimedia PC (MPC) initiative. The idea was that any PC that met certain standards could be considered a Multimedia PC, and would therefore be capable of running software that had been certified as MPC-compatible.

There were actually several versions of the Multimedia PC standard before it was eventually abandoned. The third version of the standard (MPC Level 3), for example, required the PC to have a 75 MHz or faster Intel processor, 8MB of RAM, a half-gigabyte hard disk, a 4x CD-ROM drive and support for playing MPEG files.

Even though the idea of standardizing the PC in a way that would ensure software compatibility never really caught on, I think we are seeing it happen right now thanks to desktop-as-a-service (DaaS). Although DaaS virtual PCs aren't standardized across the industry, DaaS does accomplish one very important task: For the first time ever, it is practical for everyone in an entire organization to run their desktop OS on identical hardware.

In an organization that uses physical PCs, someone will inevitably end up with a PC that is newer than someone else's and has a different hardware configuration. DaaS makes it possible to standardize virtual desktop hardware. That way, admins can rest assured that if an application runs on one virtual desktop, it will run on all of the others as well. And when it's time for a hardware upgrade, DaaS makes it possible to give everyone a performance boost at the same time.
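
To make that concrete, here is a minimal sketch of the idea in Python. Everything in it is hypothetical -- the DesktopProfile class and provision_pool function are stand-ins, not any DaaS vendor's actual API -- but it shows why standardization pays off: every desktop is stamped out from one hardware profile, and a fleet-wide upgrade is a one-line change to that profile.

```python
# Illustrative sketch only -- every name here is hypothetical, not any
# DaaS vendor's actual API. Real platforms (Azure Virtual Desktop,
# Amazon WorkSpaces and so on) express the same idea through their own
# SDKs and management portals.

from dataclasses import dataclass


@dataclass(frozen=True)
class DesktopProfile:
    """A single virtual hardware configuration shared by the whole fleet."""
    vcpus: int
    ram_gb: int
    disk_gb: int


# Every user gets identical virtual hardware, so an application that
# runs on one desktop runs on all of them.
STANDARD_PROFILE = DesktopProfile(vcpus=4, ram_gb=16, disk_gb=128)


def provision_pool(users: list[str], profile: DesktopProfile) -> None:
    """Stamp out one identical virtual desktop per user."""
    for user in users:
        # Stand-in for the platform's real provisioning call.
        print(f"Provisioning {profile.vcpus} vCPU / {profile.ram_gb}GB RAM "
              f"desktop for {user}")


provision_pool(["alice", "bob", "carol"], STANDARD_PROFILE)

# A fleet-wide "hardware upgrade" is just a new profile applied to
# everyone at once:
# provision_pool(users, DesktopProfile(vcpus=8, ram_gb=32, disk_gb=256))
```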

I have to admit that I haven't always liked the idea of using DaaS. For whatever reason, I just like having a physical PC that runs a hardware configuration of my own choosing. In a corporate environment, however, DaaS does have its place. Thanks to the way that DaaS standardizes virtual desktops, the technology will likely drive down desktop support costs.

About the Author

Brien Posey is a 22-time Microsoft MVP with decades of IT experience. As a freelance writer, Posey has written thousands of articles and contributed to several dozen books on a wide variety of IT topics. Prior to going freelance, Posey was a CIO for a national chain of hospitals and health care facilities. He has also served as a network administrator for some of the country's largest insurance companies and for the Department of Defense at Fort Knox. In addition to his continued work in IT, Posey has spent the last several years actively training as a commercial scientist-astronaut candidate in preparation to fly on a mission to study polar mesospheric clouds from space. You can follow his spaceflight training on his Web site.
