Posey's Tips & Tricks

The Potential Hidden Perils of Relying on Microsoft Copilot

While the technology aims to make our day-to-day work lives easier, problems will arise if Copilot is switched to autopilot.

At this year's Microsoft Build conference, Microsoft, not surprisingly, focused heavily on Copilot. For those who might not be familiar with Copilot, it is essentially a conversational interface that works similarly to ChatGPT. Microsoft is integrating copilots into products such as Microsoft 365, Windows and the Edge browser.

The copilots that Microsoft has created are undeniably impressive, and I believe that they represent the sort of technological change that only comes along once in a generation.

Many have expressed concern about Copilot putting people out of work or causing other economic harm. While such fears may ultimately prove warranted, I believe there is another potential problem with using Copilot that nobody is talking about yet: overreliance. To put it another way, we are bound to see mistakes made because people placed more trust in Copilot than they should have.

Please don't misunderstand me. I'm not trying to bash Copilot. If anything, I think the technology behind Copilot is amazing. Even so, Microsoft has said from the very beginning that Copilot is not perfect and has stressed the importance of reviewing AI-generated content before accepting it. Yet in the fast-paced, deadline-driven corporate world, it is easy to imagine situations in which overworked employees are under tremendous pressure to get projects done quickly and to "do more with less." As such, it isn't exactly a stretch to think that some will blindly accept Copilot-generated content without taking the time to review it.

While the situation I just described is probably the most likely way for Copilot use to go wrong, there are similar situations that could be even more problematic. Imagine a situation in which someone wants to review Copilot's AI-generated content for accuracy but lacks the knowledge to do so.

During one of the demos at this year's Build conference, a presenter showed a legal contract that had been created in Microsoft Word. The presenter then used a series of plugins from Thomson Reuters to edit the document. First, the presenter asked Microsoft 365 Copilot to edit the Limitation of Liability clause using the Practical Law plugin. The presenter then used the Westlaw plugin to ask whether the Limitation of Liability clause was enforceable under California law.

When used properly, the plugins that Microsoft demonstrated can significantly expedite the process of creating legal contracts. After all, the plugins can help ensure that such documents are written properly and that they are enforceable under state law. As such, an organization's legal department should be able to complete the contract review process far more quickly because so much of the initial cleanup work has already been done.

The problem occurs when organizations begin using Microsoft Copilot as a substitute for a true legal review. The plugins may create an unintentional perception that anyone who knows how to use Microsoft Copilot can author legal documents without the need for a lawyer. The problem with this is the aforementioned knowledge gap. Suppose for a moment that a user with no background in law uses Microsoft Copilot to generate a legal document. Even if that person takes the time to thoroughly review what Copilot has written, they likely lack the knowledge to spot potentially problematic phrases within the document. Never mind that there will undoubtedly be organizations that make a conscious decision to use Copilot as a way to save a few bucks on legal services.

There was one other potential adverse side effect to using Copilot that came to mind during the Build conference, although this one is far more lighthearted.

During a demo, Microsoft explained that ChatGPT had been integrated into Bing and that ChatGPT plugins now also work in Bing. The presenter then showed the Edge browser open to a Web page containing a recipe. The presenter asked Copilot to create a list of the ingredients used in the recipe and then used a plugin to add those ingredients to an Instacart order. Going from recipe to Instacart made for a compelling demo because it illustrated Copilot's ability to digest the contents of a Web page and put that content to work.

The thing that had me laughing about this particular demo was that the presenter didn't bother to read the recipe (at least not during the demo). The demo made it seem as though the presenter had stumbled across a recipe for Peach Melba cake and made an impromptu decision to bake one. I can just imagine the Instacart items arriving and the recipient saying something like, "What's with the raspberries? I didn't know this recipe called for raspberries. I hate raspberries!"
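Microsoft hasn't published the code behind this demo, but the underlying pattern is easy to illustrate. The short Python sketch below is purely hypothetical: it uses OpenAI's public Python client (not the Copilot or Instacart plugins themselves), and the URL, model choice and prompt wording are all my own illustrative assumptions. It simply shows a language model digesting a Web page and extracting a structured ingredient list, which is the step the demo glossed over.

    # Hypothetical sketch of the "digest a Web page" pattern from the demo.
    # This is NOT Microsoft's Copilot or Instacart plugin code. It assumes
    # the openai and requests packages are installed, an OPENAI_API_KEY
    # environment variable is set, and the recipe URL below is made up.
    import json
    import requests
    from openai import OpenAI

    RECIPE_URL = "https://example.com/peach-melba-cake"  # fictional page

    def extract_ingredients(url: str) -> list[str]:
        page_text = requests.get(url, timeout=10).text

        client = OpenAI()  # reads OPENAI_API_KEY from the environment
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model choice
            messages=[
                {"role": "system",
                 "content": "Extract the recipe ingredients from the page "
                            "text. Respond with a JSON array of ingredient "
                            "strings and nothing else."},
                # Truncate the page so it fits in the model's context window.
                {"role": "user", "content": page_text[:15000]},
            ],
        )
        # A production app would validate this output; the model could
        # return malformed JSON, or simply misread the page.
        return json.loads(response.choices[0].message.content)

    if __name__ == "__main__":
        for item in extract_ingredients(RECIPE_URL):
            print(item)  # a human should review this before ordering!

Notice that nothing in that pipeline verifies the model actually read the page correctly, which is exactly how unwanted raspberries end up in someone's shopping cart.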

All of this is to say that while I believe Microsoft Copilot will allow us to do some truly amazing things, it will be critically important to review Copilot's work and to resist the temptation to simply assume that Copilot has gotten everything right. Otherwise, the consequences could range from legal problems to a peach cake that unexpectedly contains raspberries.

About the Author

Brien Posey is a 22-time Microsoft MVP with decades of IT experience. As a freelance writer, Posey has written thousands of articles and contributed to several dozen books on a wide variety of IT topics. Prior to going freelance, Posey was a CIO for a national chain of hospitals and health care facilities. He has also served as a network administrator for some of the country's largest insurance companies and for the Department of Defense at Fort Knox. In addition to his continued work in IT, Posey has spent the last several years actively training as a commercial scientist-astronaut candidate in preparation to fly on a mission to study polar mesospheric clouds from space. You can follow his spaceflight training on his Web site.
