Matthew Finnegan
Senior Reporter

M365 Copilot, Microsoft’s generative AI tool, explained

Feature
Nov 01, 2023 | 18 mins
Artificial Intelligence | Generative AI | Microsoft

Microsoft’s generative AI assistant is integrated within a host of the company's workplace apps. Here’s what enterprises need to know and what they can do to prepare for Microsoft 365 Copilot’s general release.


Following trials with hundreds of customers in recent months, Microsoft 365 Copilot becomes generally available for larger customers on Nov. 1.

Since announcing a partnership with ChatGPT creator OpenAI earlier this year, Microsoft has been deploying the Copilot generative AI (genAI) assistant across its suite of Microsoft 365 business productivity and collaboration apps. Word, Outlook, Teams, Excel, PowerPoint, and a range of other applications are connected to the AI assistant, which can automate tasks and create content — potentially saving users time and bolstering productivity. 

The aim of Copilot is to automate tasks such as drafting an email or creating a slideshow. In a blog post announcing the tool, Microsoft CEO Satya Nadella described it as “the next major step in the evolution of how we interact with computing…. With our new copilot for work, we’re giving people more agency and making technology more accessible through the most universal interface — natural language.”

With Microsoft 365 Copilot, Microsoft aims to create a “more usable, functional assistant” for work, said J.P. Gownder, vice president and principal analyst at Forrester’s Future of Work team. “The concept is that you’re the ‘pilot,’ but the Copilot is there to take on tasks that can make life a lot easier.” 

M365 Copilot is “part of a larger movement of generative AI that will clearly change the way that we do computing,” he said, noting how the technology has already been applied to a variety of job functions — from writing content to creating code — since ChatGPT launched in late 2022.

A recent Forrester report predicted that 6.9 million US knowledge workers — around 8% of the total — will be using Microsoft 365 Copilot by the end of 2024.

Nadella talked up the effectiveness of M365 Copilot during a recent earnings call, claiming customers have seen productivity gains in line with those of GitHub Copilot, the AI assistant aimed at developers that launched two years ago. (For reference, GitHub has previously claimed developers were able to complete a single task 55% faster with GitHub Copilot, while acknowledging the challenges in measuring productivity.)

Even priced at $30 per user each month, there’s potential to deliver considerable value to businesses, assuming the Copilot is able to deliver on its promise. Said Gownder: “The key issue is, ‘Does it actually save that time?’ because it’s hard to measure and we don’t really know for sure. But even conservative time savings estimates are pretty generous.”

M365 Copilot is billed as providing employees with access to genAI without the security concerns of consumer tools; Microsoft says its models aren’t trained on customer data, for instance. But deploying the tool represents significant challenges, said Avivah Litan, distinguished vice president analyst at Gartner.

There are two primary business risks, she said: the potential for the Copilot to ‘hallucinate’ and provide inaccurate information to users, and the ability for the Copilot’s language models to access huge swathes of corporate data that’s not locked down properly.

“Information oversharing is one of the biggest issues people are going to face in the next few months, or six months to a year,” said Litan. “That’s where the rubber is going to hit the road on the risk — it’s not so much giving the data to Microsoft or OpenAI or Google, it’s all the exposure internally.”

Locking down sensitive files is not a new challenge for IT teams, but the Copilot makes it even easier for data to leak. “You just have to ask a question in English, or any native tongue, so it’s a game changer,” said Litan.

What is Microsoft 365 Copilot?

The M365 Copilot “system” consists of three elements: Microsoft 365 apps such as Word, Excel, and Teams, where users interact with the AI assistant; Microsoft Graph, which includes files, documents, and data across the Microsoft 365 environment; and the OpenAI models that process user prompts: GPT-3, GPT-4, DALL-E, Codex, and Embeddings.

These models are all hosted on Microsoft’s Azure cloud environment. 
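Microsoft hasn’t published the internals of that pipeline, but the flow it describes (take a prompt, ground it in Graph data the user is permitted to see, then hand it to a hosted OpenAI model) can be sketched in a few lines of Python. Every name below is hypothetical, for illustration only:

```python
# Illustrative sketch of a Copilot-style orchestration flow.
# Every name here is hypothetical; Microsoft has not published
# its internal implementation.
from dataclasses import dataclass

@dataclass
class GraphDocument:
    title: str
    content: str

def retrieve_context(prompt: str, graph_index) -> list[GraphDocument]:
    """Step 1: ground the prompt by pulling relevant files, mail, and
    chats from Microsoft Graph, filtered by the user's permissions."""
    return graph_index.search(prompt, top_k=5)  # hypothetical index API

def build_grounded_prompt(prompt: str, docs: list[GraphDocument]) -> str:
    """Step 2: combine the user's request with the retrieved business data."""
    context = "\n\n".join(f"[{d.title}]\n{d.content}" for d in docs)
    return f"Context:\n{context}\n\nTask: {prompt}"

def copilot_respond(prompt: str, graph_index, llm) -> str:
    """Step 3: send the grounded prompt to the Azure-hosted model and
    return its output to the host app (Word, Teams, and so on)."""
    docs = retrieve_context(prompt, graph_index)
    return llm.complete(build_grounded_prompt(prompt, docs))  # hypothetical
```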

Copilot is just part of Microsoft’s overall generative AI push. There are plans for Copilots tailored to Microsoft’s Dynamics 365 business apps, Power Platform, the company’s security suite, and its Windows operating system. Microsoft subsidiary GitHub also developed GitHub Copilot with OpenAI a couple of years ago, essentially providing an auto-complete tool for coders. 

The key component of Copilot, as with other generative AI tools, is the large language model (LLM). These models are machine-learning networks trained on huge volumes of text using self-supervised or semi-supervised methods. Essentially, data is ingested and the LLM generates a response by repeatedly predicting the most likely next word. The information in an LLM can be restricted to proprietary corporate data or, as is the case with ChatGPT, can include whatever data it’s fed or scraped directly from the web. 
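To make “predicting the most likely next word” concrete, here is a toy sketch in Python; the probability table is invented, whereas a real LLM computes such scores with billions of learned parameters:

```python
# Toy illustration of next-word prediction. A real LLM scores every
# token in a large vocabulary with billions of learned parameters;
# here a hand-written table stands in for the model.
toy_model = {
    ("the", "sales"): {"report": 0.7, "team": 0.2, "pitch": 0.1},
    ("sales", "report"): {"is": 0.5, "shows": 0.3, "for": 0.2},
}

def predict_next(prev_two: tuple) -> str:
    """Return the highest-probability next word given the previous two."""
    scores = toy_model[prev_two]
    return max(scores, key=scores.get)

words = ["the", "sales"]
for _ in range(2):  # generate two more words
    words.append(predict_next((words[-2], words[-1])))
print(" ".join(words))  # -> "the sales report is"
```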

To help businesses deploy the generative AI tool across their data, Microsoft created the Semantic Index for Copilot, a “sophisticated map of your personal and your company data,” and a “pre-requisite” to adopting Copilot within an organization. Using the index should provide more accurate searches of corporate data, Microsoft said. For example, when a user asks for a “March Sales Report,” the Semantic Index won’t just look for documents that include those specific terms; it will also consider additional context such as which employee usually produces sales reports and which application they likely use. 
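Microsoft hasn’t detailed how the Semantic Index works internally, but search of this shape is typically built on embeddings, numeric vectors compared by similarity rather than by keyword. A minimal sketch, with hand-written vectors standing in for a real embedding model:

```python
import math

def cosine(a, b):
    """Similarity between two embedding vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Hand-written stand-ins for real embeddings, which an embedding model
# would produce with hundreds of dimensions rather than three.
documents = {
    "March sales report.docx": [0.80, 0.30, 0.10],
    "Q1 revenue figures.xlsx": [0.90, 0.10, 0.00],
    "Team offsite photos.pptx": [0.00, 0.10, 0.90],
}

query = [0.75, 0.35, 0.10]  # embedding of the query "March Sales Report"
best = max(documents, key=lambda name: cosine(query, documents[name]))
print(best)  # -> "March sales report.docx"
# Per Microsoft's description, the production index also weighs signals
# such as who usually writes sales reports and in which application.
```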

How much does Copilot cost — is it worth $30 per user, per month?

With the Nov. 1 launch, Microsoft 365 Copilot is available for enterprise customers with E3 and E5 subscriptions and a minimum investment of 300 seats, and will be released later for customers on the Microsoft 365 Business Standard and Business Premium tiers. Its use costs an additional $30 per user each month, though it’s likely large customers will be able to negotiate a discount.

It’s a significant extra expense given that E3 and E5 licenses already cost $36 and $57 per user each month. Part of this is due to the high computing costs Copilot incurs for Microsoft, said Raúl Castañón, senior research analyst at 451 Research, a part of S&P Global Market Intelligence.

“Microsoft is likely looking to avoid the challenges faced with GitHub Copilot, which was made generally available in mid-2022 for $10/month and, despite surpassing more than 1.5 million users, reportedly remains unprofitable,” he said.

The pricing strategy also reflects Microsoft’s confidence in the impact that genAI will have on workforce productivity.

Per Forrester’s calculations in the “Build Your Business Case For Microsoft 365 Copilot” report, an employee earning $120,000 annually — roughly $57 per hour — might save four hours a month on various productivity tasks; those four hours would be worth around $230 a month. In that scenario, it would make sense to invest in Copilot for an employee earning even half that amount, and that’s leaving aside less tangible benefits around employee experience when automating mundane tasks.
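That break-even arithmetic is simple to reproduce:

```python
# Forrester's break-even arithmetic, reproduced step by step.
annual_salary = 120_000
hourly_rate = annual_salary / 2_080          # 2,080 work hours a year -> ~$57.69
hours_saved_per_month = 4
monthly_value = hourly_rate * hours_saved_per_month  # ~$230.77
license_cost = 30                            # per user, per month
print(f"time saved is worth ${monthly_value:.2f}/month vs a ${license_cost} license")
# Even at half the salary (~$28.85/hour), four saved hours are worth
# about $115 a month, still well above the $30 fee.
```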

There are, as the Forrester report points out, other costs to consider beyond licensing — employee training, for instance, as staff learn the new technology. Gartner also predicts that enterprise security spending will increase in the region of 10% to 15% in the next couple of years as a result of efforts to secure genAI tools (not just M365 Copilot).

Businesses are likely to take a cautious approach to deploying the Microsoft tool, at least at first. Microsoft expects revenue related to M365 Copilot to “grow gradually over time,” CFO Amy Hood said during the company’s fiscal Q1 2024 earnings call. On the same call, Nadella noted that Copilot will be subject to the usual “enterprise cycle times in terms of adoption and ramp.”

Even if the pace of adoption is gradual, there appears to be plenty of interest in deploying it. Forrester expects around a third of M365 customers in the US to invest in Copilot in the first year. Companies that do so will provide licenses to around 40% of employees during this period, the firm estimated.

How do you use Copilot?

There are two basic ways users will interact with Copilot. It can be accessed directly within a particular app — to create PowerPoint slides, for example, or an email draft — or via a natural language chatbot accessible in Teams, known as Microsoft 365 Chat. 


Copilot can help a Word user draft a proposal from meeting notes. 

Interactions within apps can take a variety of forms, depending on the application. When Copilot is invoked in a Word document, for example, it can suggest improvements to existing text, or even create a first draft.

To generate a draft, a user can ask Copilot in natural language to create text based on a particular source of information or from a combination of sources. One example: creating a draft proposal based on meeting notes from OneNote and a product road map from another Word doc. Once a draft is created, the user can edit it, adjust the style, or ask the AI tool to redo the whole document. A Copilot sidebar provides space for more interactions with the bot, which also suggests prompts to improve the draft, such as adding images or an FAQ section. 

During a Teams video call, a participant can request a recap of what’s been discussed so far, with Copilot providing a brief overview of conversation points in real time via the Copilot sidebar. It’s also possible to ask the AI assistant for feedback on people’s views during the call, or what questions remain unresolved. Those unable to attend a particular meeting can send the AI assistant in their place to provide a summary of what they missed and action items they need to follow up on. 

In PowerPoint, Copilot can automatically turn a Word document into draft slides that can then be adapted via natural language in the Copilot sidebar. Copilot can also generate suggested speaker notes to go with the slides and add more images. 

The other way to interact with Copilot is via Microsoft 365 Chat, which is accessible as a chatbot within Teams. Here, Microsoft 365 Chat works as a search tool that surfaces information from a range of sources, including documents, calendars, emails, and chats. For instance, an employee could ask for an update on a project, and get a summary of relevant team communications and documents already created, with links to sources.

Microsoft will extend Copilot’s reach into other apps workers use via “plugins” — essentially third-party app integrations. These will allow the assistant to tap into data held in apps from other software vendors including Atlassian, ServiceNow, and Mural. Fifty such plugins are available, with “thousands” more expected eventually, Microsoft said. 
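Microsoft has said these plugins share the open plugin standard used by OpenAI’s ChatGPT, in which a vendor publishes a small manifest pointing at an OpenAPI description of its service. A simplified, hypothetical manifest might look like this (all values invented):

```python
# Hypothetical Copilot plugin manifest, shown as a Python dict. The field
# names follow the published ChatGPT plugin format, which Microsoft has
# said Copilot plugins share; every value below is invented.
ticket_plugin_manifest = {
    "schema_version": "v1",
    "name_for_human": "Example Tickets",
    "name_for_model": "example_tickets",
    "description_for_human": "Search and summarize support tickets.",
    "description_for_model": (
        "Look up support tickets. Use when the user asks about open issues."
    ),
    "auth": {"type": "oauth"},
    "api": {
        "type": "openapi",
        "url": "https://tickets.example.com/openapi.yaml",  # hypothetical URL
    },
}
```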


Copilot can synthesize information about a project from different sources.

How are early customers using Copilot?

Prior to launch, many businesses accessed M365 Copilot as part of a paid early access program (EAP); it began with a small number of participants before growing to several hundred customers, including Chevron, Goodyear, and General Motors. 

One of those involved in the EAP was marketing firm Dentsu, which began by deploying 300 licenses to tech staff before expanding to employees across its business lines globally. The most popular use case so far is summarization of information generated in M365 apps — a Teams call being one example. 

“Summarization is definitely the most common use case we see right out of the box, because it’s an easy prompt: you don’t really have to do any prompt engineering… it’s suggested by Copilot,” said Kate Slade, director of emerging technology enablement at Dentsu.

Staffers also used M365 Chat to prepare for meetings, for instance, quickly pulling information from different sources. This could mean finding information from a project several years ago “without having to hunt through a folder maze,” said Slade.

The feedback from workers at Dentsu has been overwhelmingly positive, said Slade, with a waiting list now in place for those who want to use the AI tool.

“It’s reducing the time that they spend on [tasks] and giving them back time to be more creative, more strategic, or just be a human and connect peer to peer in Teams meetings,” she said. “That’s been one of the biggest impacts that we’ve seen…, just helping make time for the higher-level cognitive tasks that people have to do.”

Use cases have varied between different roles. Dentsu’s graphic designers would get less value from using Copilot in PowerPoint, for example: “They’re going to create really visually stunning pieces themselves and not really be satisfied with that out-of-the-box capability,” said Slade. “But those same creatives might get a lot of benefits from Copilot in Excel and being able to use natural language to say, ‘Hey, I need to do some analysis on this table,’ or ‘What are key trends from this data?’ or ‘I want to add a column that does this or that.’”

How does Copilot compare with other productivity and collaboration genAI tools?

Most vendors in the productivity and collaboration software market are adding genAI to their offerings, though these efforts are still in their early stages.

Google, Microsoft’s main competitor in the productivity software arena, has announced plans to incorporate genAI into its Workspace suite. Duet AI for Workspace, announced in May, is now available to enterprise customers. This can provide Gmail conversation summaries, draft text, and generate images in Docs and Slides, for instance. 

Slack, the collaboration software firm owned by Salesforce and a rival to Microsoft Teams, is also working to introduce LLMs in its software. Other firms that compete with elements of the Microsoft 365 portfolio, such as Zoom, Box, Coda, and Cisco, have also touted genAI plans.

“On the vendor side, many are jumping on the generative AI bandwagon as evidenced from the plethora of announcements in the first half of the year,” said Castañón.  

Copilot appears to have some advantages over rivals. One is Microsoft’s dominant position in the productivity and collaboration software market, said Castañón. “The key advantage the Microsoft 365 Copilot will have is that — like other previous initiatives such as Teams — it has a ‘ready-made’ opportunity with Microsoft’s collaboration and productivity portfolio and its extensive global footprint,” he said. 

Microsoft’s close partnership with OpenAI (Microsoft has invested billions of dollars in the company on several occasions since 2019 and holds a large non-controlling share of the business) likely helped it build generative AI across its applications at a faster rate than rivals. 

“Its investment in OpenAI has already had an impact, allowing it to accelerate the use of generative AI/LLMs in its products, jumping ahead of Google Cloud and other competitors,” said Castañón. 

What are the genAI risks for businesses? ‘Hallucinations’ and data protection

Along with the potential benefits of genAI tools like M365 Copilot, businesses should consider the risks. These include the hallucinations LLMs are prone to, where incorrect information is provided to employees.

“Copilot is generative AI — it definitely can hallucinate,” said Slade, citing the example of one employee who asked the Copilot to provide a summary of pro bono work completed that month to add to their timecard and send to their manager. A detailed two-page summary document was created without issue; however, the address of all meetings was given as “123 Main Street, City, USA” — an error that’s easily noticed, but an indication of the care required by users when relying on Copilot.

The occurrence of hallucinations can be reduced by improving prompts, but Dentsu staff have been advised to treat outputs from the genAI assistant with caution. “The more context you can give it generally, the closer you’re going to get to a final output,” said Slade. “But it’s never going to replace the need for human review and fact check.

“As much as you can, level-set expectations and communicate to your first users that this is still an evolving technology. It’s a first draft, it’s not a final draft — it’s going to hallucinate and mess up sometimes.”

Tools that filter Copilot outputs are emerging that could help here, said Litan, but this is likely to remain a key challenge for businesses for the foreseeable future.

Another risk relates to one of the major strengths of the Copilot: its ability to sift through files and data across a company’s M365 environment using natural language inputs.

While Copilot is only able to access files according to permissions granted to individual employees, the reality is that businesses often fail to adequately label sensitive documents. This means individual employees might suddenly realize they are able to ask Copilot to provide details on payroll or customer information if it hasn’t been locked down with the right permissions.
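One practical starting point is an inventory of where organization-wide access already exists. Below is a minimal sketch using Microsoft Graph’s driveItem permissions endpoint; it assumes an access token with the Files.Read.All scope is already available, and omits paging and error handling:

```python
# Minimal sketch of auditing for overshared files ahead of a Copilot
# rollout, using Microsoft Graph's driveItem permissions endpoint.
# Assumes an access token with the Files.Read.All scope is already in
# hand; paging and error handling are omitted for brevity.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
headers = {"Authorization": "Bearer <access-token>"}  # placeholder token
drive_id = "<drive-id>"                               # placeholder drive

items = requests.get(
    f"{GRAPH}/drives/{drive_id}/root/children", headers=headers
).json().get("value", [])

for item in items:
    perms = requests.get(
        f"{GRAPH}/drives/{drive_id}/items/{item['id']}/permissions",
        headers=headers,
    ).json().get("value", [])
    for perm in perms:
        # Sharing links scoped to "organization" are visible to the whole
        # tenant -- exactly the exposure a Copilot prompt can surface.
        if perm.get("link", {}).get("scope") == "organization":
            print(f"org-wide access: {item['name']}")
```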

A 2022 report by data security firm Varonis claimed that one in 10 files hosted in SaaS environments is accessible by all staff; an earlier 2019 report put that figure — including cloud and on-prem files and folders — at 22%. In many cases, this can mean organization-wide permissions are granted to thousands of sensitive files, Varonis said.

In many cases, the most important data, around payroll, for instance, will have strict permissions in place. A greater challenge lies in securing unstructured data, with sensitive information finding its way into a wide range of documents created by individual employees — a store manager planning payroll in an Excel spreadsheet before updating a central system, for example. This is similar to a situation that the CTO of an unnamed US restaurant chain encountered during the EAP, said Litan. 

“There’s a lot of personal data that’s kept on spreadsheets belonging to individual managers,” said Litan. “There’s also a lot of intellectual property that’s kept on Word documents in SharePoint or Teams or OneDrive.”

“You don’t realize how much you have access to in the average company,” said Matt Radolec, vice president for incident response and cloud operations at Varonis, in a recent conversation with Computerworld. “An assumption you could have is that people generally lock this stuff down: they do not. Things are generally open.”

Another consideration is that employees often end up storing files relating to their personal lives on work laptops.

“Employees use their desktops for personal work, too — most of them don’t have separate laptops,” said Litan. “So you’re going to have to give employees time to get rid of all their personal data. And sometimes you can’t, they can’t just take it off the system that easily because they’re locked down — you can’t put USB drives in [to corporate devices, in some cases].

“So it’s just a lot of processes companies have to go through. I’m on calls with clients every day on the risk. This one really hits them.”

Getting data governance in order could take businesses more than a year, said Litan. “There are no shortcuts. You’ve got to go through the entire organization and set up the permissions properly,” she said.

In Radolec’s view, very few M365 customers have yet adequately addressed the risks around data access within their organization. “I think a lot of them are just planning to do the blocking and tackling after they get started,” he said. “We’ll see to what degree of effectiveness that is [after launch]. We’re right around the corner from seeing how well people will fare with it.”