Instagram announces new safety features for teens


The app is making teens’ accounts private by default and limiting contact from ‘suspicious’ strangers

Instagram has unveiled new safety features to help keep young users safe. Now, new accounts will automatically start out as private for kids under 16. Additionally, some adults will be blocked from interacting with teenagers on the platform, and advertisers will have new rules governing how they can target teenagers. 

“While most platforms have set their minimum age for participation at 13, there’s no on/off switch that makes someone ready to be a fully media-literate participant on that birthday,” said David Kleeman, Senior Vice President, Global Trends. “Defaulting accounts to private for under-16s encourages young people to develop comfort, confidence and capability as digital citizens during their younger years and help them develop habits to last a lifetime.”

Instagram said it previously asked young users to choose between a public and a private account when they signed up, but research showed that “they appreciate a more private experience.” In testing, eight out of ten young people kept the private default settings after creating an account.

The social media platform said teenage users who already have a public account will be shown a notification “highlighting the benefits of a private account and explaining how to change their privacy settings.” They’ll still have the option of keeping their account public. 

“We think private accounts are the right choice for young people, but we recognize some young creators might want to have public accounts to build a following,” the company said. 

Blocking ‘unwanted contact’ from adults

Teenage Instagram users will also be shielded from being contacted by adults who have shown “potentially suspicious behavior,” such as having previously been blocked or reported by young people. 

Instagram said it wants teens to be able to make new friends and keep up with their family, and it doesn’t want them to deal with unwanted direct messages or comments from people they don’t know. Individuals who have exhibited suspicious behavior will have limited ability to interact with and follow teens.

"We want to ensure that teens have an extra barrier of protection around them out of an abundance of caution," said Karina Newton, Instagram's head of public policy.

Protecting young users

Instagram has maintained that keeping young users safe is a top priority. However, the Facebook-owned app has faced criticism over its idea of a separate “Instagram for Kids” app. Lawmakers and child safety advocates have raised concerns about the impact that such an app could have on children’s safety, privacy, and mental health. 

But on Tuesday, Facebook said it still intends to build an Instagram platform for kids under 13. It also announced new updates to allay safety concerns. The company said its new Instagram experience for tweens would be managed by parents and guardians in order to "reduce the incentive for people under the age of 13 to lie about their age."

"The reality is that they're already online, and with no foolproof way to stop people from misrepresenting their age, we want to build experiences designed specifically for them, managed by parents and guardians," Instagram said in a blog post. 
