Posey's Tips & Tricks
The Future for AI Adoption, Part 2: Chill
Once the "new" has worn off, look for Gen AI to continue with much less fanfare.
In the first article in this series, I explained that every time I have seen a technology gain nearly universal adoption, a distinct chain of events seems to occur. In this article, I want to walk through that chain of events and where AI fits into the mix.
Every time I have seen a technology gain mainstream adoption, it has been because the technology has a certain coolness factor, has the potential to make life easier, or both. In at least some cases, a technology is broadly adopted because it meets an obvious need. Let me give you an example.
Back in the mid-1990s, Wi-Fi was far from being a mainstream technology. The Wi-Fi hardware of the time was difficult to find and extremely expensive, and you practically needed a PhD in computer science to set it up. At that point, even "always connected" Internet wasn't really a thing yet. Most ISPs charged customers for every minute that they were connected.
Back then, I was one of the fortunate few who had Wi-Fi and high-speed Internet (thanks to an ISDN line). One day, I had a friend visiting and we needed to look something up online. Without really thinking about it, I powered up a laptop and began to search the Web. It took a good 20 minutes for my friend to realize that not only were we surfing without wires, but that I hadn't even dialed into the Internet. The experience was so seamless that it felt completely natural. In other words, the technology made life easier, but it also met a really obvious need.
One of the reasons I mention this story is that it highlights where the technology adoption process often begins -- with early adopters. Early adopters are those who are willing to pay a high price for a technology, knowing all the while that because it is brand new and far from mature, they will probably have to jump through several hoops to make it work the way they want.
Early adopters are also risk takers. As someone who has spent most of my life writing about technology, I have always tried to be an early adopter. That way, I have the opportunity to figure out how various technologies work before the rest of the world catches up. The problem with being an early adopter, however, is that products or technologies often never catch on, and you end up stuck with a high-priced gadget that has been discontinued.
As previously mentioned, when a new technology becomes wildly popular, it is often the early adopters who drive the initial enthusiasm. They might show off a cool new product to friends, family members, or coworkers. While this initial introduction might make some non-techies curious about the technology, it probably won't directly lead most of them to adopt it. After all, the technology is most likely still immature and relatively expensive at that point.
The next step in the adoption process is that the new technology slowly decreases in price and becomes easier to use. When that starts happening, the technology inevitably makes its way into the workplace. This can happen in a couple of different ways. Let me give you some examples.
When Wi-Fi first started to become affordable and easier to set up, there was a strong push from end users who wanted to be able to use their laptops without the wires. Corporate IT departments were reluctant to allow Wi-Fi adoption because of security concerns. What ended up happening in many cases is that users acquired their own Wi-Fi access points and deployed them without the IT department knowing about it. Eventually, IT realized that it was better for them to provide the users with properly configured Wi-Fi than for users to covertly deploy their own Wi-Fi with no regard for security.
Another example of technology making its way into the workplace happened soon after Apple released the iPad. High-level executives at many companies demanded that they be allowed to work from an iPad instead of a PC. The IT staff was forced to figure out how to integrate iPad use into already established IT operations.
In one of these examples, a new technology made its way into the workplace through shadow IT. In the other, adoption was mandated from above. In both cases, however, a relatively new technology eventually became sanctioned for use within organizations.
This adoption, combined with hype from the media, which is only now starting to catch on, gives the new technology ever-increasing visibility, which in turn drives even greater adoption.
As this happens, the inevitable next step is that the technology becomes wildly overhyped. Vendors scramble to embrace it out of fear of being left behind. Think back to when cloud computing first started becoming popular. There was a point at which nearly every software vendor was scrambling to include some sort of cloud-related feature in its product so that it could advertise the product as being cloud ready, cloud enabled, cloud compatible, or something to that effect.
This is the stage we are at right now with AI adoption. AI is well past the early adoption phase; it has made its way into the workplace and is well established. Now, vendors are investing staggering amounts of money into AI-enabling their products in an effort to remain relevant.
So what happens next? It's difficult to say for sure, but in the past there has always come a point at which interest in an overhyped technology begins to wane. At that point, the product is old news. Copycat vendors and technologies begin to disappear or are acquired by larger companies. At the same time, the market leaders begin work on next-generation versions of their products. These next-generation products are meant to address some of the frustrations and pain points associated with the first generation, while also, hopefully, driving renewed interest in the technology.
This is exactly how I expect the AI craze to play out. AI isn't going away, but I think that vendors who have incorporated AI into their products are going to start getting a better feel for where it does and does not make sense to use AI. Some AI features will inevitably disappear and be dismissed as failures. Others will probably start to become standards that are found in nearly every software product.
As a historical example, consider the evolution of e-commerce sites. At one time, online retailers were innovating like crazy because the industry hadn't firmly established what an online store should look like. Eventually, some of the crazier online store features began to disappear, while other experimental features, such as star ratings and customer reviews, came to be adopted by nearly all online retailers.
All of this is to say that I think AI is here to stay and will become one of those technologies we all use every day. However, I expect the ways in which vendors incorporate AI into their products to start making a lot more sense, as opposed to vendors adding AI features just for the sake of being able to say that their products use AI.
About the Author
Brien Posey is a 22-time Microsoft MVP with decades of IT experience. As a freelance writer, Posey has written thousands of articles and contributed to several dozen books on a wide variety of IT topics. Prior to going freelance, Posey was a CIO for a national chain of hospitals and health care facilities. He has also served as a network administrator for some of the country's largest insurance companies and for the Department of Defense at Fort Knox. In addition to his continued work in IT, Posey has spent the last several years actively training as a commercial scientist-astronaut candidate in preparation to fly on a mission to study polar mesospheric clouds from space. You can follow his spaceflight training on his Web site.