Anthropic is changing the default privacy setting for its Claude consumer chatbot, saying it will begin using new chat transcripts and coding sessions to train future versions of its artificial-intelligence models unless users opt out by 28 September 2025. A pop-up window will ask existing customers to accept the updated terms and toggle off data sharing if they do not want their interactions included; new users must make the same choice during sign-up. Anthropic is also extending its data-retention period to five years for users who allow training.

The policy covers all consumer subscription tiers, including Claude Free, Pro and the $100-per-month Max plan, but excludes commercial offerings such as Claude Gov, Claude for Work, Claude for Education and API access through partners like Amazon Bedrock and Google Cloud's Vertex AI. Anthropic said previously stored conversations will not be ingested unless a user reopens them, and added that the company employs automated tools to filter or mask sensitive information while pledging not to sell user data to third parties.

Alongside the policy shift, Anthropic has begun a limited research preview of a Claude browser extension for Google Chrome, granting 1,000 Max-tier subscribers the ability to instruct the AI assistant to perform tasks on web pages. The company said it will expand access gradually as it studies potential security and privacy risks.