Instagram and Messenger Give Parents More Control: All the New Tools

New parental control tools come amid a teen mental health crisis some experts say has been fueled by social media.

Nina Raemont Writer
A recent graduate of the University of Minnesota, Nina started at CNET writing breaking news stories before shifting to covering Social Security and other government benefit programs. In her spare time, she's in her kitchen, trying a new baking recipe.

Looking for a way to filter your teenagers' social media consumption? Meta has announced new features for Messenger and Instagram designed to give parents more oversight of their teens' messaging and social media use.

Meta, which owns Facebook, Instagram and WhatsApp, on Tuesday outlined parental supervision tools coming to chat app Messenger. The features give parents a look into how much time their teen spends on Messenger; who their teen is interacting with or can interact with; their privacy and safety settings; and who their teen reports on the app (if their teen chooses to disclose that information). The parental supervision tool doesn't let parents see messages their teen sends to others.

Meta is rolling out the tools in Messenger in the US, the UK and Australia on Tuesday, and said they'll come to more countries in the "coming months." Meta also teased additional time- and interaction-managing parental supervision tools it'll roll out on Messenger over the next year. 

Alongside the new Messenger features, Meta is also testing privacy messaging features on Instagram direct messages. 

"We want to protect people from unwanted interactions in Instagram DMs, and these protections are especially important when it comes to teens," the company said in the release. 

Meta is testing a feature that would require users to send an invite to someone before they can start messaging them. Users can send only one invite at a time and can't send further messages until the recipient accepts the invitation. Photos, videos and other media can't be sent before the invite is accepted. 

The company is also bringing its Take a Break feature, already in use on Instagram, to Facebook. After 20 minutes on the app, teens will be prompted to step away and to set daily time limits. 

The announcement of heightened parental controls and in-app screen time limits comes amid a teen mental health crisis and loneliness epidemic that some experts tie to excessive social media use. Some schools are even suing tech firms like Meta and TikTok, alleging that the companies have played a large part in the youth mental health crisis. 

"At this point, we do not have enough evidence to say with confidence that social media is sufficiently safe for our kids," Surgeon General Vivek Murthy said in an interview with NBC News last month. "We have to now take action to make sure that we are protecting our kids."

Meta-owned WhatsApp has also been boosting privacy protections recently, including automatically silencing unknown callers.