Instagram has introduced a new privacy setting that puts all new and existing underage accounts into auto-private mode by default.
The social media company announced Tuesday that it will limit the content teen users can see on Instagram to what they would typically encounter in a PG-13 rated movie.
Under the new content guidelines, Meta said it will hide certain accounts from teens, including accounts that share sexual content or media related to drugs and alcohol. Additionally, Instagram will discourage posts containing swear words, though teens will still be able to search for those terms.
The changes come after the company faced a wave of criticism over its handling of child safety and related mental health concerns on its platform.
As part of the changes, Instagram accounts whose names and bios include links to adult websites such as OnlyFans, or to liquor stores, will be hidden from teens, the company said. Teen Instagram users will no longer be able to follow these types of accounts, and those who already follow them will no longer be able to view or interact with their content, including shared posts.
Meta executives said in a media briefing that while the company’s previous content guidelines already met or exceeded PG-13 standards, some parents said they were confused about what content their teens could see on Instagram. To reduce that confusion, Meta decided to align its teen content policy more closely with the movie ratings parents already understand, executives said.
“We decided to more closely align our policies with independent standards that parents are familiar with, so we reviewed our age-appropriate guidelines for PG-13 movie ratings and updated them accordingly,” the company said in a blog post. “While there are of course differences between movies and social media, we made these changes to ensure that the experience for teens in a 13+ setting feels more like the Instagram equivalent of watching a PG-13 movie.”
Social media companies have come under fire from lawmakers who say they are not adequately policing their platforms on child safety-related issues.
The company, then known as Facebook, came under fire in 2021 after the Wall Street Journal published a report citing internal research showing how harmful Instagram could be, especially for teenage girls. Other reports have shown how easily teens can use Instagram to find drugs, including through ads run by the company.
Over the past year, Meta has rolled out several features aimed at providing more transparency to parents about how their teens are using the company’s apps. In July, Meta debuted new safety tools aimed at making it easier for teen Instagram users to block or report accounts, as well as give them more information about who they’re interacting with on the platform.
In August, the watchdog group Technology Transparency Project released a report claiming that Meta’s relationship and sponsorship with the National Parent Teacher Association “gives a sheen of professional approval” to Meta’s “efforts to continue to engage young users on its platform.” The national PTA said in a statement that it does not endorse any social media platform, while Meta said at the time: “Like many other technology companies, we are proud to partner with professional organizations to educate parents about our safety tools and protecting teens.”
Meta said Instagram’s new content guidelines will begin rolling out in the U.S., U.K., Australia and Canada on Tuesday, and then expand to other regions.

