We’ve created this guide to help parents navigate the controls offered by popular social media companies.
Parental controls are offered by almost every popular social media network, but many parents aren’t aware of them. Fewer than 10% of teens on Instagram had enabled parental control settings by the end of 2022, and only a single-digit percentage of parents had used the controls, according to a Washington Post report from earlier this year.
In response to concerns from Congress and rights groups about the potential harm social media inflicts on young users, tech companies have long argued that the parental controls they offer protect kids. But because those controls aren’t on by default, they do little to protect users unless parents actually enable them.
Each platform approaches parental controls a bit differently, but most of them start by allowing parents to monitor who their teen is communicating with. Some social media platforms then go a bit further by allowing parents to intervene in how their teen uses an app.
TikTok appears to be the platform that gives parents the most control over their teens’ usage. The ByteDance-owned company has faced significant scrutiny from lawmakers, arguably more so than any other platform on this list. In an apparent attempt to win them over, TikTok offers parental controls that are much more advanced than those on Instagram, Snapchat and others.
And while most social media platforms offer some sort of parental controls, some have had them for longer than others. Meta has faced scrutiny for its potential negative effect on teens and young users for over a decade, which is why it’s had parental controls for many years, whereas a platform like Discord has been able to fly under the radar and has only recently introduced parental controls.
Before getting into the controls, it’s important to recognize that teens can also create secret accounts, and that most of the parental controls on social networks rely on cooperation from both the parent and the teen.
We’ve created this guide to make it easier for parents to navigate and understand the parental controls offered by popular social media companies, and we’ve detailed how they vary from platform to platform.
Meta-owned Instagram offers parental controls through its Family Center offering. The social network gives users the option to create a “supervised account” for teens between the ages of 13 and 17. Both the teen and parent must agree to participate.
In the Family Center, parents and guardians can supervise their teen’s account by seeing how much time they’re spending on the social network. Parents can intervene in their teen’s usage of the app by setting daily time limits or adding scheduled breaks. With this feature, parents can make sure their teen is only spending a certain amount of time on the app and isn’t using it during homework or school time.
They can also see their teen’s following and followers lists in order to monitor who can view their posts and message them. Parents can also see any reports that their teen has submitted to Instagram.
Plus, parents can see their teen’s account privacy settings and sensitive content settings, along with their DM settings. They can discuss these settings with their teen to help ensure they are protected.
Like Instagram, TikTok lets parents link their account to their teen’s with its “Family Pairing” feature. After doing so, parents can decide how much time their teen can spend on the app each day. They can set their teen’s screen time limit and get a summary of how much time their teen spends on the app.
The app also lets parents mute their teen’s push notifications (TikTok mutes notifications for teens between the ages of 13 and 15 from 9 p.m. to 8 a.m. by default). Parents can also choose to pause their teen’s notifications for a custom amount of time.
TikTok lets parents take an additional step that other platforms on this list don’t: Parents can limit specific types of content. They have the option to select keywords or hashtags to exclude specific content from their teen’s For You and Following feeds. They can also enable a “Restricted Mode” that will automatically restrict their teen’s exposure to unsuitable or inappropriate content.
In addition, parents can decide whether their teen can search for videos, hashtags or live videos. Plus, they can control whether their teen’s account can be recommended to others on the app. Parents can also decide who can comment on their teen’s videos and who can view the content they like.
In terms of DMs, parents can restrict who can message their teen or turn off direct messaging altogether. It’s worth noting that DMs on TikTok are only available to accounts belonging to users who are 16 and older.
Snapchat offers parents access to a “Family Center” that lets them monitor some of their teen’s activity on the app. Parents have to create their own Snapchat account and then connect it with their teen’s.
Once parents pair the two accounts, they can see who their child is friends with on Snapchat. They can also get a glimpse of who their teen has messaged in the last seven days. Plus, parents can see a complete list of members in Groups that their teen has been active in over the last week.
It’s worth noting that parents can’t see the messages their teen has shared; they can only see a list of people that their teen has recently messaged.
Like on TikTok, parents can limit their teen’s ability to view sensitive content in Stories and Spotlight.
If parents come across an account that they’re concerned about, they can report it to Snapchat’s Trust and Safety team. However, unlike on TikTok and Instagram, parents can’t monitor or limit how much time their child is spending on the app.
Discord offers a “Family Center” that lets parents monitor their child’s activity on the platform. After enrolling in Family Center, parents receive a weekly email summary containing information about their teen’s activity. Although parents will be able to see which Discord communities and users their teen is talking to, they won’t be able to see the contents of the conversations themselves.
Parents can see their teen’s recently added friends, including their display names and avatars. They can also see which users their teen has messaged or called in direct or group messages, including the times of the last message or call.
Plus, parents can see which servers their teen joined or participated in, including server icons and member counts.
Although Discord is regularly used by a young audience, the platform was until recently largely left out of the conversation around social networks and their potential to harm children. In the past, Discord was able to sit on the sidelines while Congress grilled Instagram, Snapchat, TikTok and Facebook. However, Discord was asked to testify in Congress’ most recent hearing on child exploitation.
You may be surprised to see Facebook on this list because it’s largely known for being a social network for older people. However, while it may seem like teens aren’t using Facebook, recent reports suggest that young people are still using the Meta-owned platform.
Parents can access Facebook’s supervision controls in the same place they monitor their child’s activity on Instagram. They can see how much time their teen has spent on the Facebook app each day over the past week, along with their average daily time spent for the week. To control their teens’ time spent on the app, parents can set scheduled breaks.
In addition, parents can see their teen’s Facebook friends, along with their privacy settings and content preferences. They can also see the people and pages their teen has blocked.
X, formerly known as Twitter, is the only social media platform on this list that doesn’t offer any parental controls. While X prohibits users under 13, many teens lie about their age to join the platform.
Compared to other social media platforms, X is drowning in adult content. The company has also relaxed its hate speech filters since being acquired by Elon Musk and is significantly more lenient around cyberbullying and hateful content compared to the rest of the platforms on this list.
And while the majority of teens don’t use X, a Pew Research study from 2023 found that 23% of teens have used the social network, which is still a significant number, especially on a platform that isn’t doing anything to protect them.
X, like Discord, has been able to fly under the radar when it comes to Congress’ concerns about protecting children online. However, the company was part of Congress’ hearing on child exploitation earlier this year.