Since its release to companies of all sizes, Copilot for Microsoft 365 has become the collector's sticker every IT leader wants in their album. However, we have found that many companies still have questions about the value it brings and its limitations. In this article, we share the limits of Copilot's magic.
Copilot for Microsoft 365 is an assistant that does much more than compose emails and create presentations. But while it may seem like magic, it isn't quite magic yet. Organizations often ask us what Copilot can and can't do, and what real value it would bring to their teams. To set expectations correctly and avoid frustration, here are the limits of Copilot's magic:
1. Copilot does not perform actions on its own
Copilot for Microsoft 365 is designed to help people perform specific tasks more efficiently, but it does not operate, analyze, or decide in place of the user. It can understand what you want to do and offer suggestions, but it will not perform actions for you. Although it is technically feasible for Copilot to act on its own, Microsoft prevents it from performing actions directly, following its responsible AI practices, which state that the user should always review Copilot's responses before sharing them with anyone else. For example, if you are composing an important email and get stuck on the structure of the message, Copilot will give you intelligent real-time suggestions to improve your wording and make the message more effective, but you cannot ask it to send the email for you.
2. Copilot does not read the user’s mind
This means that Copilot cannot anticipate the user's needs or intentions without clear and direct communication. This is why training users in prompt engineering is so important: they need to know how to provide specific, detailed instructions in order to obtain the desired results.
3. Copilot does not learn from previous conversations
Copilot for Microsoft 365 saves each user's conversation history. So if one day you get stuck halfway through asking it to refine an email, the next day you can pick up where you left off. What it does not do, however, is learn from those conversations or maintain a global context: each new interaction with Copilot starts from scratch. For example, if one day you ask it to help you prepare a presentation and the next day you ask it to compose an email to send that presentation, Copilot will not remember which presentation it helped you prepare.
4. Copilot does not make decisions
Because of limits on the volume of data it can process per query and the type of analysis it can perform, Copilot cannot make objective judgments or decisions, since it cannot establish causal relationships. For advanced analysis, specialized analytical models are preferable.
Want to know how to get the most out of Copilot for Microsoft 365 without getting frustrated along the way? Our AI experts have identified a number of real-world use cases that allow you to leverage Copilot’s potential in different business areas.