Meta is rolling out tighter teen messaging restrictions and parental controls

Meta announced today that it is rolling out new DM restrictions on both Facebook and Instagram that, by default, prevent teens from receiving messages from people they aren’t connected to.

Until now, Instagram has restricted adults over the age of 18 from messaging teens who don’t follow them. The new limits will apply by default to all users under 16 (and under 18 in some geographies). Meta said it will notify existing users of the change.

On Messenger, teens will only receive messages from their Facebook friends or people in their contacts.

What’s more, Meta is making its parental controls more robust by letting guardians approve or deny changes that teens make to their default privacy settings. Previously, guardians were notified when teens changed these settings, but they couldn’t take any action.

For example, if a teen tries to switch their account from private to public, change the Sensitive Content Control from “Less” to “Standard,” or adjust who can DM them, their guardian can block the change.

Meta first rolled out parental supervision tools for Instagram in 2022, giving guardians visibility into how their teens use the app.

The social media giant said it is also planning to launch a feature that will prevent teens from seeing unwanted and inappropriate images in their DMs, even when they’re sent by people they’re connected to. The company added that the feature will work in end-to-end encrypted chats and will “discourage” teens from sending such images.

Meta didn’t specify how it will protect teens’ privacy while implementing these features, nor did it explain what it considers “inappropriate.”

Earlier this month, Meta rolled out new tools to restrict teens from viewing content related to self-harm and eating disorders on Facebook and Instagram.

Last month, Meta received a formal request for information from EU regulators asking for more details about its efforts to prevent the sharing of self-generated child sexual abuse material (SG-CSAM).

At the same time, the company is facing a civil lawsuit in New Mexico state court alleging that its social networks serve sexual content to teen users and recommend underage accounts to predators. In October, more than 40 US states filed a lawsuit in federal court in California accusing the company of designing products in ways that harm kids’ mental health.

The company is set to testify before the Senate about child safety on January 31, alongside other social networks including TikTok, Snap, Discord, and X (formerly Twitter).

 

