AI & IT: What's Up with Microsoft Copilot? A Q&A with Brien Posey
With all the FUD surrounding Copilot, we asked Microsoft MVP and Redmond columnist Brien Posey to answer some questions and help us cut through the hype surrounding the upcoming tech.
Microsoft announced Microsoft 365 Copilot in March (not yet released; no date yet). There's also GitHub Copilot for programmers (released in June; $10-$19 per month), and Microsoft's version of GitHub Copilot for Visual Studio programmers, VS Code Copilot (also not yet released). On top of that, Microsoft has announced Microsoft Security Copilot (again, not yet released).
With all these versions of Copilot, limited beta availability and no official release dates, there's a lot of confusion around Copilot -- what it is, what it isn't (and when we'll get it). We asked popular Redmond columnist and Microsoft MVP Brien Posey, who's been keeping up on Microsoft's AI initiatives from the standpoint of practical implementations for IT pros, what he knows, what he expects and anything else he could shed light on. Here's that interview:
Redmond: For IT folks who haven't been able to follow the news fully, can you explain more what Copilot is and how it will be integrated into Microsoft products?
Posey: At its simplest, Copilot is a GPT-4 chatbot, much like ChatGPT. Even so, calling Copilot a chatbot is at least somewhat misleading. I say this because chatbots are typically informational in nature. In the case of ChatGPT, for example, you can ask it questions and it provides an answer. While Copilot can indeed answer certain types of questions, it takes things a step further than a typical chatbot by building content based on your request.
Another key difference between Copilot and other chatbots is that Copilot is integrated into the various Office applications and exposes functionality related to those applications. You might, for example, ask Copilot to analyze an Excel spreadsheet or turn a Word document into a PowerPoint presentation.
As impressive as such capabilities might be, there is another aspect to Copilot that is far more profound. Copilot largely eliminates the learning curve that is associated with the Office applications. Yes, the Office applications are designed to be easy to use and most people are familiar with the Office user interface, but relatively few people use the Office applications to their full potential. Copilot will make it possible for a complete novice to use all of the advanced features in Office, even if they don't know that those features exist. Remember, with Copilot a user needs only to ask Office to do something; they don't actually have to know how to perform the task themselves.
Are Microsoft 365 Copilot and Microsoft Security Copilot (and GitHub Copilot) all the same thing, or are they all different things with just the Copilot nomenclature?
Microsoft 365 Copilot, Microsoft Security Copilot and GitHub Copilot/VS Code Copilot are all different things, though they do share some architectural similarities on the back end. Microsoft seems to be trying to build brand recognition around the Copilot name. The company did something similar with Defender a few years ago. Another example might be how Microsoft has attached the System Center brand to several products (Virtual Machine Manager, Data Protection Manager, Operations Manager, etc.) even though the various System Center products have relatively little in common with one another.
As its name suggests, Microsoft 365 Copilot is the version of Copilot that is being integrated into Microsoft 365. As of right now, it seems as though Microsoft thinks of Microsoft 365 Copilot as its flagship Copilot product. Microsoft Security Copilot is a tool that is designed to ingest security logging data and provide actionable insights based on a security admin’s natural language queries. Copilot for GitHub is geared toward developers and writes code based on natural language prompts.
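To illustrate the natural-language-to-code workflow that GitHub Copilot is built around, here is a sketch in Python. The comment plays the role of the developer's prompt; the function beneath it is representative of the kind of completion Copilot suggests, not actual Copilot output.

```python
from collections import Counter
import re

# Prompt the developer might type as a comment:
# "return the n most common words in a block of text"
def most_common_words(text, n=5):
    """Return the n most frequent words in text, case-insensitively."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words).most_common(n)
```

The developer reviews the suggestion, accepts or edits it, and moves on; the prompt-driven interaction is the same idea that powers the other Copilots, just applied to source code instead of documents.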
Are there other Copilots coming?
To the best of my knowledge, Microsoft has not announced any additional Copilots. Having said that, however, it seems extremely likely that we will see some additional Copilots announced. In fact, three future Copilots are almost a certainty. For starters, Microsoft has AI-enabled its Bing search engine and its Edge browser. Although these have not yet been branded as Copilot, I fully expect Microsoft to apply the Copilot branding.
I also expect to see Copilot added to the forthcoming Windows 12. Microsoft showed interest in integrating an AI assistant into the Windows operating system years ago when it introduced Cortana. I fully expect Microsoft to replace Cortana with Copilot in Windows 12.
"I think that the most exciting thing about Copilot is the way that it will allow anyone to use the Office applications to their full potential. Users won’t have to worry about using YouTube videos to figure out how to perform unfamiliar tasks in the Office applications. They can simply ask Copilot to do the task for them."
Brien Posey, Microsoft MVP and Redmond columnist
Do you know exactly what's powering Copilot? Is it just ChatGPT 4? Is Microsoft Graph in there? Any other solutions?
The back-end mechanisms that are driving Copilot tend to vary based on the Copilot that is being used. Security Copilot, for example, needs access to security event data that Microsoft 365 Copilot does not need. If you put those differences aside, though, there are three main components at work in Copilot.
The first of these components is the application itself. In the case of Microsoft 365 Copilot, this would be Word, Excel, PowerPoint, Teams and the other Office applications.
The second component used is Microsoft Graph. For those who might not be familiar with Graph, it is essentially an API platform that gives developers a way of interacting with Microsoft 365 applications and data. In addition to the Graph API, Microsoft has also created a Graph Connector that allows connectivity to external data sources such as non-Microsoft SaaS applications. Graph also includes a component called Microsoft Graph Data Connect that makes Graph data accessible to Azure data stores. In other words, Copilot is more than just ChatGPT bolted onto Office. Microsoft Graph gives the underlying AI engine the ability to interact with Office at the document level.
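As a concrete (and deliberately minimal) sketch of what "interacting with Microsoft 365 data through Graph" looks like, the snippet below builds a REST request against the real Microsoft Graph v1.0 endpoint for a user's recently used files. The access token is a placeholder; a real application would obtain one from Azure AD first, and the helper function name here is our own, not part of any Microsoft SDK.

```python
import urllib.parse

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def build_graph_request(resource, params=None, token="<access-token>"):
    """Return the URL and headers for a Microsoft Graph GET request."""
    url = f"{GRAPH_BASE}/{resource.lstrip('/')}"
    if params:
        # OData query options such as $top control paging/filtering.
        url += "?" + urllib.parse.urlencode(params)
    headers = {"Authorization": f"Bearer {token}"}
    return url, headers

# List the 5 most recently used documents for the signed-in user:
url, headers = build_graph_request("me/drive/recent", {"$top": 5})
```

This is the layer that lets the AI engine reach individual documents, mail and calendar items rather than just generating free-standing text.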
The third component used by Microsoft Copilot is the GPT-4 large language model (LLM). This is the component that parses the text that the user enters and turns it into something that the AI engine can act on. It is also responsible for turning Copilot's output into human-readable text.
What functionality will Copilot be bringing to Teams?
Microsoft has not yet released Copilot so we do not yet know everything that Copilot will be able to do within Teams. Even so, Microsoft has given us a preview of some of the things that Copilot can do.
One of the more useful things that Copilot can do for Teams users is to summarize a meeting that is still in progress. Imagine for a moment that a user arrives late to a meeting. That user can ask Copilot what they have missed so far and Copilot will generate a concise, bullet point summary. That way, the user can catch up without having to disrupt the meeting by asking what they have missed.
Copilot also has the ability to identify issues that were discussed in the meeting but left unresolved. This can help prevent important items from being forgotten.
Another thing that Copilot can do is help gauge sentiment. You can ask Copilot how the group feels about a topic that has been discussed and Copilot will provide insight based on the conversation. This functionality may give you an edge when a meeting involves negotiation, because there is a chance that Copilot will pick up on subtle cues that reveal how a meeting attendee really feels about an issue.
What are you most excited about with this technology? Is there anything you're not looking forward to?
I think that the most exciting thing about Copilot is the way that it will allow anyone to use the Office applications to their full potential. Users won’t have to worry about using YouTube videos to figure out how to perform unfamiliar tasks in the Office applications. They can simply ask Copilot to do the task for them. Copilot also seems to have inherent artistic abilities, and so Copilot will be able to help the artistically challenged among us to create better looking PowerPoint presentations and documents.
In spite of its many benefits, I think that there will be at least one unintended consequence to using Copilot. My guess is that Copilot will place even more pressure on office workers to get tasks done quickly. Managers will know that Copilot has the ability to expedite the document creation process and will likely become less tolerant of tasks that take a long time to complete.
Should front-line IT folks be worried about losing their jobs to AI technologies like Copilot?
It’s really difficult to say for sure. History is filled with countless examples of people losing jobs as a result of technological advancement. One of the first examples that comes to mind is the self-checkout kiosks that have largely replaced cashiers in retail establishments. Another example is how video streaming services put video rental stores out of business. In spite of these and so many other examples, however, the opposite can sometimes happen.
Back in the early 2000s, a well-known tech company met privately with tech journalists (myself included) to warn us that tech journalists and IT pros would soon be out of a job as a result of technological advancement. The reasoning behind that warning was that everything was going to move to the cloud and the IT pros who had always been responsible for overseeing various IT systems would no longer be needed. In retrospect, the opposite ended up happening. Hybrid cloud and multicloud environments caused IT systems to become even more complicated than ever before, solidifying the need for IT pros.
In the case of Copilot and other AI interfaces, it is entirely possible that IT pros might eventually be rendered obsolete, but not just yet. I have tried using AI to write PowerShell scripts, for example, but have had mixed success with AI generated scripts. The scripts produced by AI often serve as a good starting point, but require some debugging before they work properly (I’m not referring specifically to Copilot).
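To show the kind of debugging Posey describes, here is a hypothetical example (in Python rather than PowerShell, purely for illustration). The task is a common admin script: delete log files older than 30 days. The comment marks the sort of subtle bug an AI-generated first draft often contains.

```python
from datetime import datetime, timedelta
from pathlib import Path

def find_stale_logs(log_dir, max_age_days=30, now=None):
    """Return log files whose modification time is older than max_age_days."""
    now = now or datetime.now()
    cutoff = now - timedelta(days=max_age_days)
    # A typical AI first draft compares file times against `now` instead
    # of `cutoff`, which matches every file; the corrected comparison:
    return [p for p in sorted(Path(log_dir).glob("*.log"))
            if datetime.fromtimestamp(p.stat().st_mtime) < cutoff]
```

The generated script got the overall structure right, which is why it is still a useful starting point, but it needed a human to spot and fix the comparison before it was safe to run.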
I also have concerns about the role of tech journalists such as myself and front-end IT pros such as those who work at a help desk. After all, users will no longer need to ask an IT pro or consult an article to learn how to perform a task. They will simply be able to ask Copilot to do it for them, and won’t actually need to know how to manually perform the task. [EDITOR'S ASIDE: ChatGPT and other LLMs need articles like those we produce to give this detailed how-to information; they don't gain knowledge from a vacuum. We think there is an interesting potential Catch-22 coming up -- if sites stop producing how-to, what will LLMs train themselves on going forward? For low-level how-to, help documentation might suffice, but it's never been sufficient for mid-to-higher level tasks or sites like Redmondmag.com wouldn't exist. Right now we think there's still room for high-level how-to, but we're also watching to see how these tools affect things like our Google traffic, for example. –B.N.]
All of this is a long way of saying that while Copilot does have a very real potential for causing job losses for IT pros, I am optimistic that the role of the IT pro will simply evolve rather than completely going away.
Does Copilot bring with it any compliance or security risks?
Any time that a new feature is introduced, cyber criminals immediately go to work trying to identify security weaknesses and create exploits for those weaknesses. I’m sure that Copilot will be no exception. Vulnerabilities are bound to be discovered and Microsoft will create security patches to address those vulnerabilities.
Having said that, I do think that there is at least some potential for Copilot to be exploited by cyber criminals in a completely different way. My guess is that cyber criminals will use Copilot to help them to create phishing attacks that are more convincing. A criminal might, for instance, ask Copilot to create an official-looking email template or to make a phishing message sound more professional. Of course, it remains to be seen whether or not Copilot will actually honor such requests.
AND THERE'S MORE! Huge thank you to Brien for answering our questions in this first round! Now we want to know: What questions do you have for Brien on this topic? Type them in the comment box below or email them to [email protected] with the subject line "question for Brien." We're going to either do a second Q&A here or a live/prerecorded chat event to get your questions answered about Copilot or anything related to AI and IT. We look forward to hearing from you!