
A Florida man shared a chilling account of how his parents were nearly scammed using AI voice-cloning technology. Scammers cloned his voice and convinced his parents that he was in trouble and needed $30,000 for bail. The incident underscores the growing threat of AI-driven scams aimed at private individuals. A US politician's family also nearly fell victim to a similar scam demanding ₹25 lakh. A cybersecurity researcher from Unit42_Intel noted that audio deepfake attacks, once aimed primarily at businesses, are now increasingly targeting individuals.
In a tragic incident in #Agra, a woman died of a heart attack following an extortion call. The caller threatened to leak an obscene video involving her daughter if a payment of Rs 1 lakh was not made. https://t.co/N1nVe325Vc
Agra teacher gets scam call about daughter's sex racket, dies of heart attack https://t.co/j5VHn6u76Z
A government school teacher in #Agra died of cardiac arrest after receiving a threatening call from cyber fraudsters, her family reported. Know more 🔗 https://t.co/xurw01p7Rj https://t.co/klGIGvrxqr

