Microsoft fixes confusing AI rules after users call it out
Redmond, WA, USA | Tue Apr 07 2026
Microsoft just changed how it talks about its Copilot AI after people noticed a strange phrase in its terms. The company used to say Copilot is "for entertainment purposes only," which made it sound like just a fun toy. But Microsoft actually sells Copilot as a serious tool for work. Now, after users pointed out the mismatch, the company admits the old wording was outdated.
The problem started because Copilot began as a simple search helper in Bing. Back then, the phrase made sense, but now it’s used for real tasks. Microsoft says it will update the rules to match how people actually use Copilot. Still, the old terms stay in the agreement for now, along with other long legal documents users must agree to.
Other companies take a different approach. OpenAI, Meta, and others warn users that AI can make mistakes and shouldn't be trusted for important decisions. Some even require users to take full responsibility for problems caused by their AI. Microsoft's old wording stood out because it seemed to dismiss how Copilot is actually used in the real world.
Legal fights over AI mistakes are already happening. Some people have sued OpenAI, saying its chatbot gave harmful advice; one case even involved a death. These lawsuits show why clear rules matter: users need to know when an AI is reliable and when it's just guessing.
https://localnews.ai/article/microsoft-fixes-confusing-ai-rules-after-users-call-it-out-54a9cc38