Meta Platforms is rolling out special accounts with new privacy settings for teenage Instagram users, the company said on Tuesday, its latest effort to limit young users' exposure to harmful content on its apps amid regulatory pressure.
The social media firm said it will automatically migrate all designated accounts to teen accounts, which will be private by default.
Users of such accounts can only be messaged and tagged by accounts they follow or are already connected to, while sensitive content settings will be dialed to the most restrictive available.
Users under 16 years of age can change the default settings only with a parent’s permission. Parents will also get a suite of settings to monitor who their children are engaging with and limit their use of the app.
Several studies have linked social media use to higher levels of depression, anxiety and learning disabilities, particularly in young users.
Meta, ByteDance’s TikTok and Google’s YouTube already face hundreds of lawsuits filed on behalf of children and school districts over the addictive nature of social media. Last year, 33 U.S. states including California and New York sued Meta for misleading the public about the dangers of its platforms.
Top platforms, including Facebook, Instagram and TikTok, allow users who are 13 years of age and above to sign up.
Meta’s move comes three years after it abandoned development of a version of the Instagram app meant for teenagers, after lawmakers and advocacy groups, citing safety concerns, urged the company to drop it.
In July, the U.S. Senate advanced two online safety bills – The Kids Online Safety Act and The Children and Teens’ Online Privacy Protection Act – that would force social media companies to take responsibility for how their platforms affect children and teens.
Meta said it will place the identified users into teen accounts within 60 days in the U.S., UK, Canada and Australia, and in the European Union later this year. Teens around the world will start to get teen accounts in January.
(Reuters)