Microsoft spent years pushing Copilot, but now it says don’t rely on it
Key Points:
- Microsoft has integrated its AI assistant, Copilot, deeply into Windows and core productivity tools like Office and Teams, promoting it as the future of work and essential for serious tasks.
- However, Microsoft’s updated Terms of Use state that Copilot is intended for "entertainment purposes only" and should not be relied upon for important decisions such as financial, legal, or medical advice.
- The disclaimer functions as a legal safeguard against liability for AI errors, but it sits uneasily alongside Copilot’s prominent role in professional environments, where users depend on its output for critical work.
- Users online have reacted with confusion and skepticism, questioning why a tool embedded in serious work applications is simultaneously labeled as non-serious, yet difficult to disable.
- While similar disclaimers exist for other AI tools, Copilot’s mandatory integration across Microsoft’s platforms makes the contradiction particularly striking, drawing criticism of Microsoft’s messaging and approach.