Canadian Parents Back App Store Age Verification for Teens to Boost Online Safety

Canadian parents strongly support requiring app store age verification for teens when downloading apps, according to a new survey conducted by Counsel Public Affairs on behalf of Meta. The poll found that 83% of parents back policies that mandate age verification before app downloads, a view shared by 81% of non-parents. This reflects a growing demand for consistent, industry-wide safety standards to protect teens online.

Meta has long prioritized teen safety, and since launching Teen Accounts last year, it has placed hundreds of millions of teens in protected online environments. These settings include automatic privacy protections, limits on who teens can contact, and content filters designed to keep their experience age-appropriate. However, with teens using an average of over 40 apps per week, parents are calling for a unified system that extends beyond Meta's platforms to all apps.

The survey revealed that 90% of respondents believe parents should be ultimately responsible for deciding which apps are suitable for their children. To support this, Meta is proposing a system where users in Canada provide their age at account creation. If under 18, the account would need to be linked to a verified parent or guardian who must approve or deny any app download. Once age is verified, the app store would send a simple age signal to app developers, without sharing personal data, enabling apps to automatically apply age-appropriate safety settings. This approach would give parents a single, centralized place to manage their child's online access, reinforcing their role as digital guardians.

Canadians also want apps to adjust contact settings, content type, and time spent online based on age. Specifically, 82% agree that apps should limit contact with strangers, 84% support content filtering, and 71% back tools to manage screen time.

Meta is also updating its Instagram Teen Accounts to align with PG-13 movie ratings.
Starting now, teens under 18 will be automatically placed in a 13+ setting, meaning they'll see content comparable to what's appropriate for a PG-13 film. They cannot opt out without parental permission. A stricter "Limited Content" setting will also be available, filtering more content and removing the ability to comment on posts. These changes apply across Instagram (Feed, Stories, Search, and recommendations) and extend to account-level restrictions, preventing teens from following or interacting with accounts that regularly share 18+ content. Rollout began in Canada and will be fully implemented by year-end.

Meta is also enhancing teen safety with AI. New parental controls will let parents see and manage how their teens interact with AI characters. These AI characters are designed to avoid age-inappropriate topics like self-harm, suicide, or disordered eating. Teens can only engage with a curated set of AI characters focused on education, sports, and hobbies. Parents can block specific characters, disable one-on-one chats entirely, or view discussion topics. These features will begin rolling out in Canada in early 2026, starting with Instagram.

Meta's data shows that while it has made significant progress with Teen Accounts, parents are looking for a broader, consistent approach. The most effective way to ensure safety, the company says, is through verified age checks and parental consent at the app store level.
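To make the proposed age-signal mechanism concrete, here is a minimal sketch of the flow the article describes: the app store verifies age once, and apps receive only a coarse age band rather than a birthdate or identity. All names and types below are hypothetical illustrations; Meta has not published an API for this proposal.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical age bands; the article only says a "simple age signal"
# is sent without sharing personal data.
class AgeBand(Enum):
    UNDER_13 = "under_13"
    TEEN_13_17 = "teen_13_17"
    ADULT_18_PLUS = "adult_18_plus"

@dataclass
class SafetySettings:
    limit_stranger_contact: bool   # 82% of respondents want this
    filter_mature_content: bool    # 84% support content filtering
    screen_time_tools: bool        # 71% back screen-time management

def age_signal(birth_year: int, current_year: int) -> AgeBand:
    """App-store side: derive a coarse band; no personal data leaves here."""
    age = current_year - birth_year
    if age < 13:
        return AgeBand.UNDER_13
    if age < 18:
        return AgeBand.TEEN_13_17
    return AgeBand.ADULT_18_PLUS

def apply_settings(signal: AgeBand) -> SafetySettings:
    """Developer side: map the received signal to age-appropriate defaults."""
    is_minor = signal in (AgeBand.UNDER_13, AgeBand.TEEN_13_17)
    return SafetySettings(
        limit_stranger_contact=is_minor,
        filter_mature_content=is_minor,
        screen_time_tools=is_minor,
    )

# A teen's app receives only the band, never the birth year:
settings = apply_settings(age_signal(birth_year=2010, current_year=2025))
print(settings.filter_mature_content)  # True for a 15-year-old
```

The key design point the article emphasizes is the one-way boundary: only `age_signal`'s output crosses from the app store to developers, so apps can set defaults without ever handling the underlying personal data.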
