
Meta Defends Teen Safety Efforts Amid Lawsuits, Highlights Decade of Protective Measures and Scientific Evidence

Meta emphasizes its long-standing commitment to teen safety and well-being, countering recent lawsuits that blame social media for teen mental health challenges. While these legal actions often rely on selective interpretations of internal documents, Meta argues that the broader scientific consensus shows teen mental health is shaped by a wide range of factors, including academic pressure, family dynamics, socio-economic conditions, and school safety, not social media use alone. Meta highlights that most teens use platforms like Instagram and Facebook to stay connected with loved ones, express themselves, build communities, and access support.

Recognizing parents' concerns, the company has consistently developed tools and features to help families manage teens' online experiences responsibly. Since 2017, Meta has introduced a range of safety measures, including suicide prevention tools, automatic sharing of support resources when users search for self-harm content, and restrictions preventing adults from initiating private messages with teens they aren't connected to. In 2021, the company made accounts private by default for users under 16 in the U.S. and under 18 in certain other regions, and began prompting existing teen users to make their accounts private. In 2023, Meta launched time management features that let teens set daily usage limits. In 2024, it introduced Instagram Teen Accounts with built-in protections that limit who can contact teens, restrict content exposure, and allow parents to set time limits; these features expanded to Facebook and Messenger in 2025. Additional safeguards include enhanced DM safety tools and clearer identity information about chat partners for teens.

In 2025, Meta also introduced content settings for Teen Accounts modeled on 13+ movie ratings. By default, teens see content appropriate for that age group and cannot opt out without parental permission; parents can also choose a stricter setting if desired.
The company has further strengthened school safety by creating a prioritized channel for educators and administrators to report teen safety concerns. As AI tools become more prevalent, Meta has implemented protections in its AI products to ensure safe responses to prompts related to self-harm, suicide, and disordered eating, and new parental controls will let parents monitor how their teens interact with AI. Parents today have robust supervision tools, including the ability to set daily time limits as low as 15 minutes, block app usage during specific hours, and view their teen's messaging activity. Meta also works with law enforcement, experts, and industry partners to address threats and deliver educational resources that empower teens and families.

Critically, Meta points to growing evidence that teen mental health trends are improving despite continued or increased social media use. According to the U.S. Department of Health and Human Services' National Survey on Drug Use and Health, major depressive episodes among teens dropped from 21% in 2021 to 15% in 2024, and serious suicidal thoughts declined from nearly 13% to 10% over the same period. The National Academies of Sciences, Engineering, and Medicine has also acknowledged the positive role social media can play in fostering belonging, enabling creative expression, and opening opportunities for teens, such as building audiences for art, sports, or entrepreneurship.

Meta maintains that while social media may present risks, it also offers meaningful benefits, and that focusing solely on negative narratives ignores the complexity of teen well-being and distracts from addressing deeper, systemic issues. The company stresses that safety decisions like default private accounts and parental controls have often come at the cost of user engagement and growth, yet it has implemented them anyway, believing they are the right choice for teens and families.
In court, Meta will defend against claims that misrepresent its actions. The company remains committed to continuous improvement, prioritizing teen safety and empowering parents with tools and transparency.