
Ofcom, the UK's media regulator, has proposed a new code of practice under the Online Safety Act to strengthen child protection online. The code would require social media platforms such as TikTok, Instagram, and Facebook to implement robust age verification, including photo ID checks, alongside 40 other safety measures. Companies that fail to comply could face fines running into billions of pounds. The proposed measures would also require tech firms to change their algorithms so that harmful content is not recommended to children. Ofcom has published its draft Children's Safety Codes of Practice setting out these new standards.
The UK media regulator has set out new rules for social media companies designed to keep children safe online. 🖥️ On the #Daily @skynewsniall is joined by the FT's @CristinaCriddle & @JohnC1912 who is on the govt advisory body for online safety ⬇️ 🎙️ https://t.co/IJafU5jPf9
Ofcom published its draft Children's Safety Codes of Practice laying out the new standards it will expect tech giants to follow to protect children under the Online Safety Act. Science Sec @michelledonelan explains the new measures ⬇️ #PoliticsHub 🔗 https://t.co/GlTNastFii https://t.co/ity03nLd4I
Ofcom proposes new rules requiring tech companies to change their algorithms to hide "toxic" material from children, have more robust age checks, and more (BBC) https://t.co/Rr3cbKfufu 📫 Subscribe: https://t.co/OyWeKSRpIM https://t.co/kSHvsB5fIF