Meta launches parent-managed WhatsApp accounts for children under 13

Meta has introduced parent-managed accounts on WhatsApp, allowing children under the age of 13 to use the messaging app more safely under supervision. These accounts include controls to limit contacts and restrict certain features. The rollout will begin gradually in the coming months.

Meta announced the introduction of parent-managed accounts on WhatsApp, aimed at enabling safer use for young people under 13. Parents or guardians can link their phone to the child's device by placing them next to each other, granting control over who can contact the child and which groups they can join.

The accounts are limited to messaging and calling, excluding features such as Channels, location sharing, and Meta AI integration. By default, only saved contacts can message the managed account; requests from unknown contacts go to the parent first, who must approve any group joins or invitations from strangers. The account is protected by a PIN, and privacy settings can be adjusted only by the managing adult from their own device.

All conversations on these accounts benefit from WhatsApp's end-to-end encryption, maintaining privacy from external access. Meta did not specify a minimum age for these accounts but plans a gradual rollout over the coming months. Step-by-step setup instructions are available for users.

This move builds on Meta's recent efforts to enhance parental controls across its platforms. In September, the company launched teen accounts for users aged 13 to 15 on Facebook and Messenger, allowing parents to review requests and apply stricter privacy options. A year prior, under-16 accounts became mandatory on Instagram with similar safeguards. Earlier in 2026, Meta temporarily halted teen interactions with its AI chatbot characters after reports of inappropriate conversations with minors.

These developments reflect Meta's ongoing focus on child safety amid growing scrutiny of social media's impact on young users.

Related articles

Illustration of Discord users facing mandatory teen settings and age verification prompts amid privacy backlash. (AI-generated image)

Discord defaults all users to teen settings with age verification

Reported by AI · AI-generated image

Discord announced it will default all accounts to a teen-appropriate experience starting in early March, requiring age verification to access adult content and restricted servers. The move aims to enhance child safety but has sparked backlash over privacy concerns following a recent data breach. Verification options include on-device facial estimation or submitting government IDs.

Meta has also announced plans to trial premium subscription services that may charge users, a move that could reshape how people interact on its platforms. Free access will remain, with added features reserved for subscribers.


Meta Platforms' Japanese subsidiary announced that Instagram will launch a new feature in Japan this year that notifies parents if children aged 13 to 17 repeatedly search for suicide or self-harm content in the app. The feature requires parents to link their account to their child's. Access restrictions on posts related to drugs and dangerous behavior are also being introduced.

Several countries have implemented or debated measures to limit children's and teenagers' access to social media, citing impacts on mental health and privacy. In Argentina, experts emphasize the need for digital education and structural regulations beyond simple bans. The issue involves not only child protection but also the platforms' data-based business model.


Under a new agreement with the Ministry of Information and Communications Technology, Meta has committed to strengthening its mechanisms for detecting, reporting, and removing disinformation and inappropriate content on Facebook. This includes faster flagging of child exploitation material, immediate reporting to local authorities, and removal from the platform. The agreement also targets scams that use deepfakes of officials, business leaders, and celebrities, such as fake investment schemes.

The SPD has proposed a ban on social media platforms for children under 14 in an impulse paper. The plan includes age verification via the EU app EUDI-Wallet and tiered rules by age group. It draws inspiration from Australia's recent model.


On February 9, Discord announced new default settings to promote age-appropriate experiences, taking effect in March. The "teen by default" policy requires age verification to access sensitive content and features. Users have raised concerns about privacy and the risk of data breaches.

