Meta launches parent-managed WhatsApp accounts for children under 13

Meta has introduced parent-managed accounts on WhatsApp, allowing children under the age of 13 to use the messaging app more safely under supervision. These accounts include controls to limit contacts and restrict certain features. The rollout will begin gradually in the coming months.

Meta announced the introduction of parent-managed accounts on WhatsApp, aimed at enabling safer use for young people under 13. Parents or guardians can link their phone to the child's device by placing them next to each other, granting control over who can contact the child and which groups they can join.

The accounts are limited to messaging and calling, excluding features such as Channels, location sharing, and Meta AI integration. By default, only saved contacts can message the managed account. Message requests from unknown contacts go first to the parent, who must approve group joins and invites from strangers. The accounts are protected by a PIN, and privacy settings can be adjusted only by the managing adult from their own device.

All conversations on these accounts remain protected by WhatsApp's end-to-end encryption, keeping them private from external access. Meta did not specify a minimum age for these accounts but plans a gradual rollout over the coming months, and has published step-by-step setup instructions for users.

This move builds on Meta's recent efforts to expand parental controls across its platforms. In September, the company launched teen accounts for users aged 13 to 15 on Facebook and Messenger, allowing parents to review requests and apply stricter privacy options. A year earlier, teen accounts with similar safeguards had become mandatory for under-16 users on Instagram. Earlier in 2026, Meta temporarily halted teen interactions with its AI chatbot characters after reports of inappropriate conversations with minors.

These developments reflect Meta's ongoing focus on child safety amid growing scrutiny of social media's impact on young users.

Related Articles


Discord defaults all users to teen settings with age verification

Reported by AI

Discord announced it will default all accounts to a teen-appropriate experience starting in early March, requiring age verification to access adult content and restricted servers. The move aims to enhance child safety but has sparked backlash over privacy concerns following a recent data breach. Verification options include on-device facial estimation or submitting government IDs.

Meta, the parent company of Facebook, Instagram, and WhatsApp, has announced plans to trial premium subscription services that may charge users. Free access will remain, with added features reserved for subscribers.


Meta Platforms' Japanese arm has announced that Instagram will add a new feature in Japan this year, notifying parents if children aged 13-17 repeatedly search for suicide or self-harm content on the app. The feature requires parents to link their accounts to their child's. Instagram will also soon restrict access to posts about drugs and other dangerous behavior.

Meta will discontinue end-to-end encrypted messaging in Instagram direct messages after May 8, 2026, due to low adoption. Affected users will receive in-app instructions to download their messages and media. The company directs users to WhatsApp for continued encrypted messaging.


Several countries have implemented or debated measures to limit children's and teenagers' access to social media, citing impacts on mental health and privacy. In Argentina, experts emphasize the need for digital education and structural regulations beyond simple bans. The issue involves not only child protection but also the platforms' data-based business model.

A jury in New Mexico ruled Meta liable for violating the state's consumer protection laws, ordering the company to pay a $375 million penalty. The verdict stems from allegations that Meta misled users about platform safety amid child exploitation risks. Meta plans to appeal the decision.


Discord announced new default settings on February 9 to enhance age-appropriate experiences, set to take effect in March. The 'teen-by-default' policy requires age verification to access sensitive content and features. Users have expressed concerns over privacy and the potential for data breaches.

