Facebook, Instagram limiting more content for teens as regulatory pressure mounts

Meta Platforms said Tuesday it would hide more content from teens on Instagram and Facebook, after regulators around the globe pressed the social media giant to protect children from harmful content on its apps.

All teens will now be placed into the most restrictive content control settings on the apps, and additional search terms will be limited on Instagram, Meta said in a blog post.

The move will make it more difficult for teens to come across sensitive content, such as posts about suicide, self-harm and eating disorders, when they use features like Search and Explore on Instagram, according to Meta.

Mark Zuckerberg’s Meta said the measures, expected to roll out over the coming weeks, would help deliver a more “age-appropriate” experience.

Meta is under pressure both in the US and Europe over allegations that its apps are addictive and have helped fuel a youth mental health crisis.

Attorneys general of 33 US states including California and New York sued the company in October, saying it repeatedly misled the public about the dangers of its platforms.

In Europe, the European Commission has sought information on how Meta protects children from illegal and harmful content.

The regulatory pressure followed testimony in the Senate by a former Meta employee who alleged the company was aware of harassment and other harms facing teens on its platforms but failed to act against them.

The employee, Arturo Bejar, called for the company to make design changes on Facebook and Instagram to nudge users toward more positive behaviors and provide better tools for young people to manage unpleasant experiences.

Bejar said on Tuesday that Meta’s changes did not address his concerns, that the company was relying on “‘grade your own homework’ definitions of harm,” and that it still did not offer a way for a teen to easily report an unwanted advance.

“This should be a conversation about goals and numbers, about harm as experienced by teens,” he told Reuters.

Children have long been an appealing demographic for businesses, which hope to attract them as consumers at ages when they may be more impressionable and to build brand loyalty early.

For Meta, which has been locked in fierce competition with TikTok for young users in recent years, teens may help secure more advertisers, who hope children will keep buying their products as they grow up.