Microsoft Copilot’s terms of use currently claim that the artificial intelligence (AI) model is “for entertainment purposes only”.
The description, which was last updated on October 24, 2025, also noted that “it can make mistakes, and it may not work as intended” and urged users not to “rely on Copilot for important advice. Use Copilot at your own risk”.
The agreement added: “We do not make any warranty or representation of any kind about Copilot.
“For example, we can’t promise that any Copilot’s Responses won’t infringe someone else’s rights (like their copyrights, trademarks, or rights of privacy) or defame them.
“You are solely responsible if you choose to publish or share Copilot’s Responses publicly or with any other person.”
A Microsoft spokesperson told PC Mag that the company will be updating the disclaimer’s “legacy language” soon.
They said: “The ‘entertainment purposes’ phrasing is legacy language from when Copilot originally launched as a search companion service in Bing.
“As the product has evolved, that language is no longer reflective of how Copilot is used today and will be altered with our next update.”
According to Tom’s Hardware, other AI providers use disclaimers similar to Microsoft’s.
xAI notes that “Artificial intelligence is rapidly evolving and is probabilistic in nature; therefore, it may sometimes: a) result in Output that contains ‘hallucinations,’ b) be offensive, c) not accurately reflect real people, places or facts, or d) be objectionable, inappropriate, or otherwise not suitable for your intended purpose”.