
Meta has announced a series of updates aimed at creating a safer online experience for younger users across its platforms. Amid ongoing scrutiny of social media’s impact on teens, the changes reflect the company’s attempt to address mounting concerns about user wellbeing, particularly in light of lawsuits and calls for stricter regulation.

Key Updates to Instagram, Facebook, and Messenger for Teen Protection
Meta’s latest batch of safety features focuses on giving parents more control and enhancing protections against unwanted interactions and harmful content. As part of the update, Instagram is introducing new restrictions for live streaming and direct messages (DMs), two areas that have raised concerns about teen vulnerability on the platform.
For teens under 16, Instagram will now require parental approval for two significant changes: going live and turning off the automatic blurring of explicit images in DMs. In practice, teens cannot access Instagram Live without a parent’s permission, and the default blur on explicit images in DMs stays in place unless a parent approves disabling it on their teen’s behalf.
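For readers who think in code, here is a minimal, purely illustrative sketch of the gating logic described above. All names (TeenSettings, can_go_live, try_disable_blur) are hypothetical; Meta has not published its actual implementation.

```python
from dataclasses import dataclass

@dataclass
class TeenSettings:
    """Hypothetical model of the under-16 defaults described above."""
    age: int
    blur_explicit_dms: bool = True        # blur is on by default
    parent_approved_live: bool = False
    parent_approved_unblur: bool = False

    def can_go_live(self) -> bool:
        # Under-16s cannot use Live without parental approval.
        return self.age >= 16 or self.parent_approved_live

    def try_disable_blur(self) -> bool:
        # The DM blur stays on unless a parent approves turning it off.
        if self.age < 16 and not self.parent_approved_unblur:
            return False  # request denied; default stays in place
        self.blur_explicit_dms = False
        return True

teen = TeenSettings(age=14)
print(teen.can_go_live())        # False: no parental approval yet
print(teen.try_disable_blur())   # False: blur cannot be disabled
print(teen.blur_explicit_dms)    # True: default remains in place
```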
Instagram’s decision to implement these changes reflects a growing awareness of the risks posed by live streaming. With its unfiltered, real-time nature, live streaming leaves young users open to potential exploitation or inappropriate content. By requiring parental oversight, Meta aims to strike a balance between giving teens the freedom to engage on the platform and protecting them from harmful experiences.
Expanding Teen Accounts to Facebook and Messenger
In addition to the Instagram updates, Meta is expanding its teen account protections to Facebook and Messenger. The safeguards introduced on Instagram are being replicated on these platforms, offering teens a similarly protected environment. These measures include restrictions on inappropriate content and unwanted contact, as well as tools to help ensure that teens’ time on the platforms is spent in a healthy, positive way.
“Teen Accounts on Facebook and Messenger will offer similar, automatic protections to limit inappropriate content and unwanted contact, as well as ways to ensure teens’ time is well spent,” Meta explained. The rollout will begin in the US, UK, Australia, and Canada, with other regions to follow. The expansion aims to bring the same level of safety to Meta’s entire social media ecosystem, giving teens consistent protection across its platforms.
With these changes, Meta seeks to provide a safer, more controlled environment for young users, giving them the ability to engage with others while safeguarding against potential threats.

The Impact of These Safety Updates
The significance of these updates becomes even clearer when considering Meta’s track record. Last year, Instagram rolled out a suite of privacy and safety features that included enhanced messaging restrictions, sleep modes to limit screen time, and limits on harmful content exposure. According to Meta, these features have made a notable difference in keeping teens safe on the platform.
“Since making these changes, 97% of teens aged 13-15 have stayed in these built-in restrictions, which we believe offer the most age-appropriate experience for younger teens,” Instagram revealed. The figure suggests that, even where opting out of some safety measures is possible, the vast majority of younger teens remain within the default protections.
Meta’s Teen Safety Updates: A Positive Step or a Response to Legal Pressure?
While these updates certainly seem like a positive move for teen safety, it’s worth asking whether Meta is making these changes purely out of a sense of responsibility or in response to external pressure. In recent years, Meta has faced significant criticism over the risks its platforms pose to young users, including addiction, mental health challenges, and exposure to harmful content. The U.S. Surgeon General has even called for “cigarette-like” warning labels on social media platforms to alert users to potential risks.
Meta’s updates could also be seen as a direct response to the growing legal and regulatory pressures the company is facing. In 2023, a teenager filed a $5 million lawsuit against Meta, accusing the company of using addictive algorithmic features that harm young users. Additionally, in response to these concerns, the U.S. Senate passed the Kids Online Safety Act, which seeks to impose stricter privacy and safety requirements on platforms like Instagram and Facebook.
Meta’s expanded teen protection features can also be read as an attempt to get ahead of mounting legal scrutiny. Given the growing evidence of harms linked to social media, including its ties to mental health issues, these updates may represent a step in the right direction, but more will likely be required to fully address the challenges these platforms pose.

Will These Updates Be Enough to Protect Teens in the Long Run?
The introduction of new teen safety tools is undeniably a positive step, but it’s important to maintain a critical perspective. Meta’s latest actions come at a time when social media platforms are under increasing scrutiny for their role in affecting young people’s mental health. While Meta’s updates show a willingness to address these concerns, they may not go far enough to fully mitigate the potential risks.
In particular, the ongoing push for innovation in AI, virtual reality (VR), and other immersive technologies raises further questions about the long-term impact on teen users. As new technologies evolve, the risks to mental health may grow, and the conversation around how to best protect young users on social media platforms will only become more urgent.
While Meta’s latest efforts to enhance teen safety are commendable, it’s clear that the battle to safeguard young users on social media is far from over. The real challenge will lie in ensuring that these platforms continue to evolve in a way that prioritizes the well-being of their youngest users while balancing the pressures of innovation and market demands.