Worried about what your kids might be up to on social media? If so, Meta’s continued push on teen safety might come as a relief. The company announced Tuesday that, effective immediately, it’s expanding its Instagram Teen Accounts to its other platforms, Facebook and Messenger.
It also announced additional built-in protections for Instagram Teen Accounts. These prevent users under the age of 16 from going live on the platform, or from turning off the feature that blurs images suspected of containing nudity in direct messages, without parental permission.
Meta first launched Instagram Teen Accounts in September 2024, in a bid to make the platform a safer place for kids and give parents more oversight and supervision options. In an update on Tuesday, the company said it had moved 54 million accounts over to Teen Accounts so far, with more to come. The accounts offer built-in protections, including being set to private by default and a hidden words feature that automatically filters out problematic comments and DM requests.
With parental agreement, some of these features can be switched off, but Meta said that so far 97% of teens aged 13 to 15 had kept the default safeguards in place. Citing a Meta-commissioned survey conducted by Ipsos, the company said 94% of parents found the protections helpful, with 85% saying the protections made it easier for teens to have positive experiences on Instagram. The company didn’t say how many parents were surveyed, or where they were located.
Children’s safety campaigners have spent years asking social media companies to make their platforms safer for kids. While progress has been slow, Meta’s recognition that teens need protections different enough from adults’ to warrant a separate kind of account has been an important breakthrough. Other platforms have followed suit, with TikTok introducing new parental controls last month.