
Meta enhances safety with new features for teen accounts

Wed, 9th Apr 2025

Meta, the parent company of Instagram and Facebook, has announced an expansion of its Teen Accounts to provide enhanced safety and privacy features for users under 16. The move builds on the protections introduced last September and extends parental controls to Facebook and Messenger. The company has introduced several key updates aimed at safeguarding young users across its platforms.

The changes include restricting Instagram live-streaming for users under 16, who will now require parental permission to broadcast live. In addition, images suspected of containing nudity in direct messages (DMs) will be automatically blurred for these younger users, a setting that cannot be changed without parental consent. These features form a core part of the Teen Accounts system, which by default places young users into a private account, limits who can message them, and applies strict sensitive content restrictions.

Meta's updates come in response to growing concerns about children's safety online and align with regulatory expectations, such as those set out in the UK's Online Safety Act. The legislation requires tech giants to strengthen measures that prevent users, particularly minors, from being exposed to illegal or harmful content. Terry Green, a Social Media partner at the law firm Katten Muchin Rosenman UK LLP, noted that Meta's changes reflect a proactive approach to complying with existing and forthcoming regulations.

Green highlighted the significance of the Online Safety Act and its comprehensive coverage, which extends to a wide range of online harms and protections. The act mandates a rigorous approach, compelling all platforms to address risks through assessments and compliance with forthcoming guidance. As part of these efforts, platforms are expected to carry out illegal-harms risk assessments and assess risks related to children's access in order to ensure a safer online environment.

In its announcements, Meta emphasized the role of Teen Accounts in addressing parents' concerns about their children's online safety. The company revealed that, since the accounts were first introduced, around 54 million teenagers have been moved to Teen Accounts globally, with further expansion anticipated. Feedback from both teens and their parents has been overwhelmingly positive, with a recent survey indicating that 90% of parents found the new features supportive.

Despite the positive reception of these safety features, Meta has faced criticism for some of its broader decisions on content moderation. Earlier this year, Meta's CEO Mark Zuckerberg announced plans to phase out third-party fact-checkers in favour of user-generated community notes to encourage free expression. The decision has drawn criticism from online safety experts and campaigners, who warn that it may increase younger users' exposure to harmful content. Zuckerberg himself acknowledged that the changes might result in catching "less bad stuff."

As Meta rolls out these Teen Account changes globally, the tech giant is clearly weighing regulatory pressure alongside public sentiment about online safety. By enhancing parental controls and aligning more closely with legislative expectations, Meta aims to provide a safer digital environment for young users across its popular platforms.
