• 'Copilot is for entertainment purposes only': Even Microsoft's official terms and conditions say you really shouldn't be using its AI at work

    From TechnologyDaily@1337:1/100 to All on Friday, April 03, 2026 17:15:28
    'Copilot is for entertainment purposes only': Even Microsoft's official terms and conditions say you really shouldn't be using its AI at work

    Date:
    Fri, 03 Apr 2026 16:05:00 +0000

    Description:
    Microsoft notes Copilot is for "entertainment purposes only" and to be used "at your own risk," but nothing has changed to its availability.

    FULL STORY ======================================================================

    Microsoft has clarified some of the terms and conditions associated with Copilot
    Responsibility for the AI tool has been shifted onto its users
    Despite being for "entertainment purposes," it's still heavily marketed toward workers

    In a major twist, Microsoft has reaffirmed that Copilot is for "entertainment purposes only" and that, if used for work, it should be treated as the first of multiple stages of fact-checking rather than relied upon.

    "It can make mistakes, and it may not work as intended," the company wrote. "Don't rely on Copilot for important advice. Use Copilot at your own risk." Though the company very much wants businesses and employees to continue using Copilot for work, there's a clear shift of responsibility onto the user here, clearing Microsoft of any accusations of spreading false information.

    Microsoft says "use Copilot at your own risk"

    In a roundabout way, Microsoft is effectively admitting to the risk of AI hallucination amid ongoing concerns about copyrighted content, IP ambiguity and output legitimacy.

    With this in mind, the company clearly wants us to think of Copilot as a
    tool, not a decision-maker, and for users to independently fact-check outputs and be cautious with any sensitive, protected data.

    "You agree to indemnify us and hold us harmless... from and against any claims, losses, and expenses... arising from or relating to your use of Copilot," Microsoft added in another paragraph.

    More broadly, the company also notes that prompts and responses may be used
    to improve Copilot, though enterprise versions have additional protections
    to safeguard sensitive information. In other words, users retain the rights
    to their inputs, but Microsoft still reserves the right to use that data to
    improve the service.

    However, while Microsoft's effort to push some responsibility onto users'
    shoulders has hit the spotlight, it isn't the only company with such terms: OpenAI, Google and Anthropic all include similar advisories in their terms, covering user responsibility and offering no guarantee of accuracy.

    The shift in responsibility from AI vendor to user is an ongoing change that companies are asserting while the industry still works out what the legal risks could be, but with Microsoft still selling Copilot tools to business users
    and consumers, this is clearly more a terms-rewording exercise than a total shift in behavior.




    ======================================================================
    Link to news story: https://www.techradar.com/pro/copilot-is-for-entertainment-purposes-only-even-microsofts-official-terms-and-conditions-say-you-really-shouldnt-be-using-its-ai-at-work


    --- Mystic BBS v1.12 A49 (Linux/64)
    * Origin: tqwNet Technology News (1337:1/100)