Posey's Tips & Tricks

Hardware-Level AI and Brainier CPUs: Predicting the Future of Computers

It's unlikely that machines will ever perfectly mimic a human brain, but next-generation computers could come very close. Here are some educated guesses as to what the future of computing could look like.

Someone recently asked me what I thought computers would be like 50 to 100 years in the future. That seemingly simple question was surprisingly difficult for me to answer. I still keep catching myself thinking about it days later.

One of the reasons why that question is so hard to answer is that it assumes there will be only one type of computer. Even today, there is a huge variety of computers. We have PCs, servers, mainframes, Internet of Things (IoT) devices -- the list goes on and on.

Therefore, to say something like "There will be a quantum computer on every desk" would be incredibly short-sighted. While I do expect quantum computers to eventually become a mature technology, I also expect that there will be computers that are more traditional. Even so, these computers will likely be based on an architecture that is far different from what we have today.

Although I think that it will be impossible for an electronic device to ever perfectly mimic the human brain, I think that computers will eventually adopt some of the brain's characteristics. One such characteristic is the heavy use of offloading. As strange as it sounds, the human brain actually does quite a bit of task offloading. For example, the brain offloads many of the tasks related to spatial positioning to the eyes. That isn't to say that the eyes have memory (they don't), but rather that the eyes reduce the brain's task load. Let me give you an example.

You probably know the layout of your home better than any other place on earth. If you were to close your eyes and try walking around your home, however, you would probably end up walking into something, and might even hurt yourself. The point is that even though you know the layout of your home very well, your brain does not remember the spatial positioning of objects in perfect detail. Instead, it relies on your eyes to deliver real-time spatial data. That is an example of task offloading.

Today's computers already do some task offloading. For example, many network cards include TCP/IP offloading capabilities. In the future, however, I think that task offloading will be far more prevalent than it currently is.
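
To make the offloading idea concrete, here is a minimal sketch in Python. A thread pool stands in for dedicated offload hardware (the way a NIC computes TCP checksums so the CPU doesn't have to); the function names and the use of CRC32 as the "offloaded" work are my own invention for illustration, not anything from the article.

```python
from concurrent.futures import ThreadPoolExecutor
import zlib

# Stand-in for dedicated offload hardware: a small worker pool that
# handles checksumming so the "main processor" code path never does
# that work itself.
offload_engine = ThreadPoolExecutor(max_workers=2)

def checksum(packet: bytes) -> int:
    # The offloaded task: compute a CRC32 over the packet payload.
    return zlib.crc32(packet)

def send_packets(packets):
    # Hand each packet's checksum calculation to the offload engine,
    # then collect the results once they are ready. The caller stays
    # free to do other work in between, just as a CPU does when a NIC
    # handles checksums in hardware.
    futures = [offload_engine.submit(checksum, p) for p in packets]
    return [f.result() for f in futures]
```

The key point of the sketch is the division of labor: the main code path decides *what* needs doing, while a separate engine does the repetitive work.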

I also believe that tomorrow's computers will likely be equipped with sensors that go far beyond anything that we have today. One of the things that makes modern smartphones so powerful is that the devices include so many different sensors that can be leveraged by applications. A modern smartphone can sense GPS position, light, temperature, sound, touch input and much more. Just imagine the types of applications that developers could create if computing devices were equipped with 3-D vision, thermal imaging or perhaps a spectrometer that could identify unknown substances. As electronic components continue to shrink in size and processing power continues to increase, having an increasingly exotic array of sensors embedded into a computing device will become ever more practical.

Another change that I expect to see in the future is the introduction of massively parallel processing (MPP) capabilities. Pretty much every computer being manufactured today includes multiple CPU cores, but I think that within two or three decades we will probably have CPUs that include hundreds of cores.

Of course, the way in which these cores are used will have to change. Today, having multiple cores is beneficial only if the software actually runs multiple threads. I think that in the future we will see the micro-segmentation of execution threads, thereby enabling software to more easily leverage all of the available CPU cores. We will probably even see decision trees within software constructed in a way that assigns each branch of the tree to a separate CPU core.
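
A rough sketch of that branch-per-core idea, in Python. Everything here is hypothetical: the branch format (a name plus a list of values) and the scoring function are invented for illustration, and a thread pool stands in for the per-core assignment a future CPU might do natively.

```python
from concurrent.futures import ThreadPoolExecutor

def score_branch(branch):
    # Hypothetical per-branch work: evaluate one branch of a decision
    # tree. A branch here is just (name, list_of_values).
    name, values = branch
    return name, sum(values)

def best_branch(branches):
    # Fan the branches out across workers. On the many-core CPUs
    # imagined above, each branch could run on its own core instead
    # of sharing a pool of threads.
    with ThreadPoolExecutor() as pool:
        scored = list(pool.map(score_branch, branches))
    return max(scored, key=lambda pair: pair[1])
```

The design choice worth noting is that the branches are independent, which is exactly what makes them easy to spread across cores; software written this way scales with the core count instead of ignoring it.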

Another fundamental change that will probably be made to future CPUs is the introduction of preprocessing capabilities. This is something that the human brain does really well, but that currently eludes most computers. Let me give you an example.

If someone were to throw a baseball at your face, what would happen? You would probably blink, move and maybe even try to block or catch the ball. What you wouldn't do is stop and think about your choices. This is an example of preprocessing; your brain senses danger and reacts accordingly without involving the normal cognitive thought process.

If the machines of the future are equipped with vast arrays of sensors, as I predict, then it is not unthinkable that some of the sensory input could be filtered through a preprocessing engine that allows the machine to react instantly to certain stimuli. We already see this type of preprocessing happening on some special-purpose computers, such as the ones that control the airbags in a car. As CPU-intensive technologies such as artificial intelligence (AI) and natural machine interaction become more prevalent, however, I think that preprocessing engines will be used as a tool for making computers much more responsive to user input.
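
The airbag-style reflex can be sketched in a few lines. This is a conceptual illustration only; the threshold, the function names and the return values are all made up, and a real preprocessing engine would sit in hardware, not in Python.

```python
import time

DANGER_THRESHOLD = 100  # hypothetical sensor level that demands a reflex

def reflex_filter(reading):
    # Preprocessing stage: a cheap check applied to every sensor
    # reading. Like an airbag controller, it reacts immediately and
    # never consults the slower "cognitive" path.
    if reading >= DANGER_THRESHOLD:
        return "REACT"
    return None

def deep_analysis(reading):
    # Stand-in for the normal, slower processing path.
    time.sleep(0.01)  # simulate expensive work
    return f"analyzed:{reading}"

def handle(reading):
    # Every reading passes through the reflex filter first; only
    # non-urgent readings ever reach the slow path.
    return reflex_filter(reading) or deep_analysis(reading)
```

The essential property is that the urgent path short-circuits the expensive one, which is why the reaction can be near-instant regardless of how busy the main processing logic is.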

I also think that CPUs of the future will include embedded AI engines. The HoloLens 2 already has a dedicated AI chip, so it is not a stretch to imagine something like this eventually being integrated into a computer's primary microprocessor.

Incidentally, if computers of the future do include hardware-level AI capabilities, then I think that we can eventually expect to see a change in the way that memory is referenced. Today there are already databases that can store tables in memory. Why can't a computer's entire working set of memory function like an indexed but free-form database table?

There would be a couple of benefits to doing this. First, because the memory is indexed, it could allow data to be recalled from memory much more quickly. Second, it could allow the AI processor to begin to make associations between items in memory.
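
Here is a toy version of that "working memory as an indexed, free-form table" idea, using Python's in-memory SQLite (one of the in-memory database engines the article alludes to). The schema, the tag-based grouping and every name in it are invented for illustration; grouping by tag is only a crude stand-in for the richer associations an AI engine might make.

```python
import sqlite3

# Working memory modeled as a single free-form table: every record is
# a row, and an index on the tag column lets related items be recalled
# without scanning everything.
mem = sqlite3.connect(":memory:")
mem.execute("CREATE TABLE working_set (key TEXT, tag TEXT, value TEXT)")
mem.execute("CREATE INDEX idx_tag ON working_set (tag)")

def remember(key, tag, value):
    # Store an arbitrary item in "memory".
    mem.execute("INSERT INTO working_set VALUES (?, ?, ?)", (key, tag, value))

def recall_by_tag(tag):
    # The index makes this recall fast, and pulling back everything
    # that shares a tag is a simple form of association between items.
    return mem.execute(
        "SELECT key, value FROM working_set WHERE tag = ?", (tag,)
    ).fetchall()

remember("doc1", "project-x", "budget draft")
remember("doc2", "project-x", "schedule")
remember("note1", "personal", "grocery list")
```

Both predicted benefits show up even in this toy: the index speeds recall, and the shared tag ties related items together so they can be retrieved as a group.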

The end result could be a computer that is far more intuitive than anything that we have today, because it literally learns what is important to you as you use it.

Ultimately, I have no idea what the future holds for digital electronics. All I can do is speculate, using what I know about today's technology as the basis for that speculation.

About the Author

Brien Posey is a 19-time Microsoft MVP with decades of IT experience. As a freelance writer, Posey has written thousands of articles and contributed to several dozen books on a wide variety of IT topics. Prior to going freelance, Posey was a CIO for a national chain of hospitals and health care facilities. He has also served as a network administrator for some of the country's largest insurance companies and for the Department of Defense at Fort Knox. In addition to his continued work in IT, Posey has spent the last several years actively training as a commercial scientist-astronaut candidate in preparation to fly on a mission to study polar mesospheric clouds from space. You can follow his spaceflight training on his Web site.

