Reading Copilot’s terms of use, one gets the impression that Microsoft doesn’t really believe in its own AI. However, the company has an explanation.
Most Windows 11 users are fed up with the Copilot AI. Microsoft has taken notice and promises to remove it from several parts of the operating system, even though the example of Notepad suggests the gesture is mostly for show.
Meanwhile, Copilot is back in the spotlight after internet users dug into its terms of use, the kind we all accept without ever reading. This time they are worth a closer look, as some passages are perplexing.
Microsoft Doesn’t Trust Its Own Copilot AI
Below, we have isolated the most important passage. It reads: “Copilot is only for entertainment. It may make mistakes and not work as expected. Do not rely on Copilot for important advice. Using Copilot is at your own risk.” Reassuring.
A bit further down, Microsoft warns: “we cannot guarantee that Copilot’s responses will not infringe on the rights of others (such as copyright, trademarks, or the right to privacy) or damage their reputation.”
An official admission that the Redmond firm doesn’t believe in its own product? A spokesperson says no: “The mention of ‘for entertainment purposes’ is a vestige from the time when Copilot first launched as a search aid in Bing.”
So it was simply an overlooked update, though it is rather ironic that it came to light at a time when Copilot is so widely rejected. The representative adds that the text will soon be revised to better reflect how the AI is used today.