Google has begun a wide rollout of its Sensitive Content Warnings tool for the Messages app on Android, 10 months after first announcing the safeguard. The feature detects images containing nudity and automatically blurs them, letting the recipient choose whether to view the image, delete it, or block the sender. Image classification is handled locally on the device via Android System SafetyCore, and Google says no identifiable data or flagged content is transmitted to its servers. The warnings also appear when a user attempts to send or forward an explicit image, requiring confirmation before the message is delivered. Sensitive Content Warnings are switched on by default for supervised accounts and for teens aged 13–17, while adult users must enable the setting manually. A Google Account sign-in is required for the system to function.
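
For readers who want a concrete picture of the flow, the sketch below models the behaviour described above in plain Kotlin. It is an illustration only: SafetyCore's classification interface is not part of Google's public description, so NudityClassifier, SensitiveContentGate, and the action types are hypothetical names; only the overall logic (classify on-device, blur incoming matches, ask for confirmation before sending) follows Google's account.

```kotlin
// Hypothetical sketch of the described flow. NudityClassifier stands in for the
// on-device SafetyCore model, whose real API is not documented in the source.

/** Stand-in for an on-device classifier; no image data leaves the device. */
interface NudityClassifier {
    fun containsNudity(imageBytes: ByteArray): Boolean
}

sealed class IncomingAction {
    /** Blur the image behind a warning; the user may view, delete, or block the sender. */
    data class ShowBlurred(val imageBytes: ByteArray) : IncomingAction()
    data class ShowNormally(val imageBytes: ByteArray) : IncomingAction()
}

sealed class OutgoingAction {
    /** Ask the sender to confirm before the message is delivered. */
    data class RequireConfirmation(val imageBytes: ByteArray) : OutgoingAction()
    data class SendImmediately(val imageBytes: ByteArray) : OutgoingAction()
}

class SensitiveContentGate(
    private val classifier: NudityClassifier,
    // On by default for supervised accounts and teens; adults enable it manually.
    private val warningsEnabled: Boolean,
) {
    /** Decide how to present a received image: blurred behind a warning, or as normal. */
    fun onImageReceived(imageBytes: ByteArray): IncomingAction =
        if (warningsEnabled && classifier.containsNudity(imageBytes))
            IncomingAction.ShowBlurred(imageBytes)
        else
            IncomingAction.ShowNormally(imageBytes)

    /** Decide whether sending or forwarding an image needs an extra confirmation step. */
    fun onImageOutgoing(imageBytes: ByteArray): OutgoingAction =
        if (warningsEnabled && classifier.containsNudity(imageBytes))
            OutgoingAction.RequireConfirmation(imageBytes)
        else
            OutgoingAction.SendImmediately(imageBytes)
}

fun main() {
    // Dummy classifier for demonstration; a real implementation would call the on-device model.
    val gate = SensitiveContentGate(
        classifier = object : NudityClassifier {
            override fun containsNudity(imageBytes: ByteArray) = false
        },
        warningsEnabled = true,
    )
    println(gate.onImageReceived(ByteArray(0)))  // ShowNormally(...)
}
```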