Owen Daily
    Technology

    Anthropic users face a new choice – opt out or share your data for AI training

    August 28, 2025 · 4 Mins Read

    Anthropic is making some big changes to how it handles user data: all Claude users must decide by September 28 whether they want their conversations used to train its AI models. When we asked the company what prompted the move, it pointed us to its blog post about the policy changes, but we have some theories of our own.

    But first, what's changing: previously, Anthropic didn't use consumer chat data for model training. Now the company wants to train its AI systems on user conversations and coding sessions, and it says it's extending data retention to five years for those who don't opt out.

    It's a massive update. Previously, users of Anthropic's consumer products were told that their prompts and conversation outputs would be automatically deleted from Anthropic's back end within 30 days “unless legally or policy-required to keep them longer,” or that their inputs and outputs might be retained for up to two years if flagged as violating its policies.

    By consumer, we mean the new policies apply to Claude Free, Pro, and Max users, including those using Claude Code. Business customers using Claude Gov, Claude for Work, Claude for Education, or API access will be unaffected.

    So why is this happening? In its post about the update, Anthropic frames the changes around user choice, saying that users who don't opt out will help improve model safety, making its systems for detecting harmful content more accurate and less likely to flag harmless conversations. Users will also “help future Claude models improve at skills like coding, analysis, and reasoning, ultimately leading to better models for all users.”

    In short, help us help you. But the full truth is probably a little less selfless.

    Like every other large language model company, Anthropic needs data more than it needs people to have fuzzy feelings about its brand. Training AI models requires vast amounts of high-quality conversational data, and access to millions of Claude interactions should provide exactly the kind of real-world content that can improve Anthropic's competitive position against rivals like OpenAI and Google.


    Beyond the competitive pressures of AI development, the change would also appear to reflect broader industry shifts in data policy, as companies like Anthropic and OpenAI face increasing scrutiny over their data retention practices. OpenAI, for example, is currently fighting a court order that forces it to retain all consumer ChatGPT conversations indefinitely, including deleted chats, because of a lawsuit filed by The New York Times and other publishers.

    In June, OpenAI COO Brad Lightcap called it “a sweeping and unnecessary demand” that fundamentally conflicts with the privacy commitments the company has made to its users. The court order affects ChatGPT Free, Plus, Pro, and Team users, although customers with zero data retention agreements remain protected.

    What's surprising is how much confusion all of these changing usage policies are creating for users, many of whom remain oblivious to them.

    To be fair, everything is moving quickly, so it makes sense that as the technology changes, privacy policies change too. But many of these changes are fairly sweeping, and are mentioned only fleetingly amid the companies' other news. (You wouldn't think Tuesday's policy changes for Anthropic users were very big news based on where the company placed the update on its press page.)

    But many users don't realize that the guidelines they agreed to have changed, because the design practically guarantees it. Most ChatGPT users keep clicking “delete” toggles that aren't technically deleting anything. Meanwhile, Anthropic's implementation of its new policy follows a familiar pattern.

    How so? New users will choose their preference during sign-up, but existing users face a pop-up with “Updates to Consumer Terms and Policies” in large text and a prominent black “Accept” button.
    As The Verge observed today, the design raises concerns that users may quickly click “Accept” without realizing they are agreeing to share their data.

    Meanwhile, the stakes for user awareness couldn't be higher. Privacy experts have long warned that the complexity surrounding AI makes meaningful user consent nearly unattainable. Under the Biden administration, the Federal Trade Commission even stepped in, warning that AI companies risk enforcement action if they engage in surreptitiously changing their terms of service or privacy policies, or burying disclosures in legalese or fine print.

    Whether the commission, currently operating with just three of its five commissioners, still has an eye on these practices today is an open question, and one we've put directly to the FTC.
