Meta is rolling out tighter safety rules for teenagers on Facebook and Messenger, expanding a system it first launched on Instagram last year.
These new rules aim to make the platforms safer for users under 18 by limiting certain features and requiring parental approval for others.
The system, called Teen Accounts, will now automatically apply more restricted settings to younger teens, especially those aged 13 to 15. For example, they won’t be able to live stream or disable protection features without getting permission from a parent or guardian. The update is starting this week in the UK, US, Australia, and Canada.
Meta says the idea is to give teens a safer experience by default, while also giving parents more oversight. The company claims it has already moved more than 54 million teen users worldwide into these safer settings since launching the feature on Instagram in September.
Teen Accounts are applied based on the age users enter when signing up. Older teens, aged 16 and 17, can switch off some restrictions on their own, but younger teens need a parent's approval to do the same. Meta also uses video selfies and other tools to verify users' ages, and plans to use artificial intelligence to catch those who lie about being older.
However, some experts and campaigners worry that the changes don’t go far enough. They point out that kids can still easily fake their age, and it’s unclear how well these new protections actually work. There are also calls for Meta to prevent harmful content from appearing on its platforms in the first place, rather than just reacting to it.
Safety groups say the company needs to do more to protect teens from risks like online exploitation, sextortion scams, and harmful content. Critics also argue that Meta has not been transparent about how effective the Teen Accounts feature really is.
Despite these concerns, others believe this is a positive step. They say it shows big social media companies are beginning to compete over who can offer the safest experience for teens, rather than just trying to attract the most users.
Meta says teenagers will receive notifications letting them know their account is changing. Soon, going live and turning off protections such as the blurring of suspected nude images in messages will also require a parent's approval.
As governments like the UK push for stronger online safety laws, including the new Online Safety Act, companies like Meta are under more pressure to build safer environments for young users. But many believe the work is far from over.
