Meta Platforms' Japanese arm has announced that Instagram will add a new feature in Japan this year that notifies parents if children aged 13 to 17 repeatedly search for suicide or self-harm content on the app, provided the parents have linked their accounts to their child's. The company will also soon restrict teens' access to posts about drugs and dangerous behavior.
The notifications rely on Instagram's "Teen Accounts" feature, which limits certain functions for users aged 13 to 17, the age range permitted under the app's terms of use. When a teen repeatedly searches for suicide-related content, a parent is alerted via the app or email. The feature is already available in the United States and Britain and will be introduced in Japan for the first time this year. For it to work, parents must link their accounts to their child's.
Additionally, Instagram will soon add a feature that restricts users aged 17 and under from viewing posts containing drug-related content, extreme language such as threats, and dangerous acts such as firing guns. The platform already limits the display of posts containing sexual imagery or relating to alcohol or tobacco.
Social media makes it easy to keep in touch with friends, but concerns have grown worldwide that it can lead to bullying and suicide. In the United States, lawsuits have been filed against platform operators. In Australia, a law banning social media use by those under 16 took effect in December last year.
These steps aim to further protect children amid ongoing discussions about online safety.