Ask Titania: Are the New Instagram Parental Controls Any Good?
Dear Titania,
This week, I read about Instagram’s new parental controls. I wanted to know if these are any different from all of the other updates they’ve made to Family Center. My daughter turns 15 soon — our agreed-upon age for getting social media — so I’ve been weighing the pros and cons of letting her make an account. Is this new update actually putting control in a parent’s hands?
Signed,
Inquiring About Instagram
Dear Inquiring About Instagram,
First, I want to commend you for waiting as long as you have to grant permission for social media. These companies all say 13 is the minimum age, but I pretty strongly disagree with that. Waiting to allow apps like these is never, ever a bad idea.
So, Instagram. One of the most popular apps in the entire world, and it has been for more than a decade. While kids are also drawn to Snapchat and upstarts like TikTok, Instagram is still a mainstay for young (and old) alike. As you probably know, it’s more than just a place to display pretty, filtered photos. It’s become a place to chat with friends, post stories, follow celebrities and politicians, and even get news.
But it’s also a place where inappropriate content runs rampant, disappearing messages fly back and forth, predators lurk, misinformation spreads, and cyberbullying occurs on a fairly regular basis.
The TL;DR for your answer is: Kinda, but not really. Instagram has made their strongest move yet for online protections, but it’s not perfect. I’ll get into all of the details below.
The Launch of Teen Account Parental Controls
This both is and isn’t Instagram’s first parental control rodeo. Over the past few years — actually, just two, since Family Center launched in 2022 — they’ve been slowly rolling out (relatively) ineffective efforts to protect kids on their platform. With Family Center, parents can link their Instagram accounts to their child’s for “supervision.” The problem with this? Kids can turn it off at literally any time, which makes it next to useless.
But this week’s announcement was the boldest yet — teens under 16 will have to have their parent’s permission to remove the new protections Instagram is automatically placing on their accounts.
While it may be a case of too little, too late, any steps Meta can take to reduce the rate of predation and sextortion on their platforms are meaningful and much appreciated. This stuff is hard, and we didn’t know a lot of things back then that we know now.
Caveat 1: Your Teen Has to Actually Be a Teen on Instagram
This is less important in your situation since your child hasn’t had an account yet, but Instagram hasn’t always been great at making sure teens are actually signing up for teen accounts. There’s historically been no real age-gating on the front end, meaning that there are undoubtedly tons of teens that won’t be placed into teen accounts because Instagram thinks they're adults. Lately, however, if you do wish to change your age — whether up or down — you have to actually contact Instagram to make it go through.
Since your daughter will be creating a brand new account, though, make sure she uses her actual birthdate.
Caveat 2: Even If Your Teen Is a Teen on Instagram, They Still Have to Agree to Supervision
Instagram is touting all these new protections in their Family Center, along with the fact that parents must approve any deviations from the automatic safeguards. But the missing pieces to this puzzle? Two HUGE things:
First: Your child has to agree to your supervision. Full stop. You can’t automatically implement it and you can’t passcode-protect it.
Second: They can turn off supervision at any time. This could work for families where there’s a certain level of trust, but honestly, the risks are too high.
The New Rules
Let’s dive into the new features that have been making the news lately. Existing teens under 16 — as well as new teens under 16 who sign up — will automatically be placed into “Teen Accounts,” which will have the following safeguards:
Default private accounts
Private accounts are generally a good rule of thumb for all Instagram users, but especially for kids, so creepers can’t view photos of them or any identifiable info like their school name or home address.
Unfortunately, not everything is truly private, either. If a child accepts a scammer’s follow request, the child’s entire follower list then becomes exposed — which can lead to large-scale extortion scams.
DMs will have restrictions
Teens will only be able to chat with people they follow or are already connected to. This is a good step toward preventing randos from DMing kids, but it doesn’t account for the possibility that these same people may first get acquainted with kids elsewhere, like when a teen gets a gamer’s Instagram handle in a Call of Duty chat.
Less sensitive content shown in Reels and Explore
The Instagram Explore section can be a pretty toxic place, and the company even acknowledges this. Teens will have less of a chance to encounter sensitive content that promotes eating disorders, cosmetic procedures, violence, and other topics that may be harmful or affect self-esteem. There’s a fine line between protecting kids and censorship, but there are some topics that don’t deserve a platform on social media.
Limited interactions
Teens can only be tagged or mentioned by friends they’re connected to. Instagram is also automatically enabling their anti-bullying feature, Hidden Words, so that offensive words and phrases will be filtered out of teens’ comments and DM requests.
Time limit reminder notifications
This one seems like wishful thinking — teens will get an alert after they’ve been on the app for an hour. Unless there’s an actual lock-out feature, though, this is just a suggestion. I know that, personally, I’ve set a one-hour-per-day limit for myself on Instagram and I still blow through it on the regular. If I were under 18? Probably even more so, with or without a “suggestion.”
Sleep mode
Again, another nice idea, but if a teen wants to scroll on Instagram instead of sleep, there’s nothing in this feature that will prevent it. It feels like a really nice thing to say in a press release the day before a Senate committee meeting (ahem), but it’s not actually effective.
Try Device-Level Monitoring and Protection with the Bark Phone
Instagram, it seems, is trying to make their platform a better place, but unfortunately these dangers have been around for years now, putting countless children in harm’s way. I’m glad they’re making progress, but these steps are still not enough.
After all, there’s only so much they can do when there are so many different ways kids can create accounts and access them. Even though the term “Finsta,” or fake Instagram, doesn’t really get thrown around anymore, they still exist. A kid could agree to supervision on their “main” account and then have a separate account, created on a friend’s phone, that they log into occasionally on their own.
That’s where Bark comes in — if your child has a Bark Phone, Bark will monitor whatever Instagram account they’re logged in to (teen account or not, known to you or not) for potential dangers, specifically direct message text and search bar queries.
The Bark Phone also lets you set screen time rules that actually cut off access to an app once a time limit is reached — not just send an easy-to-swipe-away reminder.
At the end of the day, if your daughter wants to get around some of the Instagram restrictions in place, it’s possible that she could find a way. That’s why it’s all the more important to sit down with her and talk about everything you’re concerned about. You can always start off very limited — like only 20 minutes of Instagram time while the family is all together — and go from there. Good luck!