Instagram announced Tuesday that it’s launching a new tool called “Sensitive Content Control.” According to Instagram’s parent company, Facebook, the new tool will let users “see more or less of some types of sensitive content” on the app’s Explore page.
The company said it’s aiming to give users the ability to filter out content that may be “upsetting or offensive.” Sensitive content could include sexually suggestive or violent posts, or posts that promote things like tobacco or pharmaceutical use. Users have three levels of sensitive content filtering to choose from: “Allow,” “Limit (Default),” and “Limit Even More.”
“We recognize that everybody has different preferences for what they want to see in Explore, and this control will give people more choice over what they see,” Facebook said in a release.
Improving content settings
Previously, Instagram relied on a set of recommendation guidelines to keep this type of content out of users’ feeds. Now the company is letting users dial the default setting up or down. Users over the age of 18 can instead choose to allow all posts the app classifies as “sensitive.” Instagram said it will still remove posts that violate its guidelines outright.
“We're constantly making improvements to ensure you have more control over the content you see on Instagram,” the platform said on its support page. “However, the Sensitive Content Control does not apply to content that violates our Community Guidelines.”
The setting will be accessible by going to the Settings menu, selecting the “Account” option, and then selecting “Sensitive Content Control.”