Microsoft Copilot: For Fun or For Real?
4 Apr
Summary
- Copilot's terms state it's for entertainment.
- Users question Microsoft's trust in its own AI.
- Similar disclaimers are used by psychic services.

Microsoft's AI assistant, Copilot, is under scrutiny because its user terms categorize it as "for entertainment purposes only." This designation, added in an update last fall, sits at odds with the company's marketing, which positions Copilot as a valuable tool for both individuals and businesses.
The terms explicitly warn users against relying on Copilot for important advice, stating that it can make mistakes and may not function as intended. Microsoft also disclaims responsibility should Copilot's responses infringe rights such as copyright or privacy, or defame individuals.
This "entertainment only" clause has drawn criticism on social media, with users questioning why they should trust a product that even its creators seem hesitant to stand behind. The phrasing mirrors disclaimers found on psychic service websites, which are intended to limit legal liability.
These terms have circulated widely, fueling debate about Microsoft's confidence in its own AI technology. The company has invested billions in OpenAI and has already faced AI-related lawsuits, a backdrop that makes the cautious wording all the more conspicuous.