Meta launches parent-managed WhatsApp accounts for children under 13

Meta has introduced parent-managed accounts on WhatsApp, allowing children under the age of 13 to use the messaging app more safely under supervision. These accounts include controls to limit contacts and restrict certain features. The rollout will begin gradually in the coming months.

Meta announced the introduction of parent-managed accounts on WhatsApp, aimed at enabling safer use for young people under 13. Parents or guardians can link their phone to the child's device by placing them next to each other, granting control over who can contact the child and which groups they can join.

The accounts are limited to messaging and calling, excluding features such as Channels, location sharing, and Meta AI integration. By default, only saved contacts can message the managed account; requests from unknown contacts go to the parent first, who must approve any group joins or invites from strangers. The accounts are protected by a PIN, and privacy settings can be adjusted only by the managing adult from their own device.

All conversations on these accounts benefit from WhatsApp's end-to-end encryption, maintaining privacy from external access. Meta did not specify a minimum age for these accounts but plans a gradual rollout over the coming months. Step-by-step setup instructions are available for users.

This move builds on Meta's recent efforts to enhance parental controls across its platforms. In September, the company launched teen accounts for users aged 13 to 15 on Facebook and Messenger, allowing parents to review requests and apply stricter privacy options. A year prior, under-16 accounts became mandatory on Instagram with similar safeguards. Earlier in 2026, Meta temporarily halted teen interactions with its AI chatbot characters after reports of inappropriate conversations with minors.

These developments reflect Meta's ongoing focus on child safety amid growing scrutiny of social media's impact on young users.

Related articles

Illustration of Discord users facing mandatory teen settings and age verification prompts amid privacy backlash.
AI-generated image

Discord defaults all users to teen settings with age verification

Reported by AI

Discord announced it will default all accounts to a teen-appropriate experience starting in early March, requiring age verification to access adult content and restricted servers. The move aims to enhance child safety but has sparked backlash over privacy concerns following a recent data breach. Verification options include on-device facial estimation or submitting government IDs.

Meta, the parent company of Facebook, Instagram, and WhatsApp, has announced plans to trial premium subscription services that may charge users, a move that could reshape social media interactions. Free access will remain, with added features for subscribers.

Reported by AI

Meta Platforms' Japanese arm has announced that Instagram will add a new feature in Japan this year, notifying parents if children aged 13-17 repeatedly search for suicide or self-harm content on the app. This requires parents to link their accounts to their child's. Additionally, it will soon introduce restrictions on access to posts about drugs and dangerous behavior.

Meta will discontinue end-to-end encrypted messaging in Instagram direct messages after May 8, 2026, due to low adoption. Affected users will receive in-app instructions to download their messages and media. The company directs users to WhatsApp for continued encrypted messaging.

Reported by AI

Several countries have implemented or debated measures to limit children's and teenagers' access to social media, citing impacts on mental health and privacy. In Argentina, experts stress the need for digital education and structural regulation beyond simple bans. The issue involves not only the protection of minors but also the platforms' data-driven business model.

A jury in New Mexico ruled Meta liable for violating the state's consumer protection laws, ordering the company to pay a $375 million penalty. The verdict stems from allegations that Meta misled users about platform safety amid child exploitation risks. Meta plans to appeal the decision.

Reported by AI

Discord announced new default settings on February 9 to enhance age-appropriate experiences, set to take effect in March. The 'teen-by-default' policy requires age verification for accessing sensitive content and features. Users have expressed concerns over privacy and potential data breaches.

