Instagram is overhauling its content policy for teenagers, introducing a new system inspired by the “parental guidance” ratings used in the film industry. The social media platform will now automatically place all users under 18 into a more sheltered content environment.
The new policy establishes a “13+” setting as the default for all teen accounts. This move is designed to proactively limit exposure to mature themes. A crucial element of this system is that any attempt by a teen to opt out of these stronger protections will require direct approval from a parent or guardian.
Meta clarified that the new PG-13 equivalent will build upon existing safety measures. It will specifically target and reduce the visibility of content featuring strong language, risky stunts, and themes that could encourage harmful activities. Furthermore, to prevent intentional exposure, the platform is blocking a range of search terms deemed inappropriate for younger audiences.
This change is being implemented as Meta faces continued pressure from safety advocates and regulators. An independent review recently labeled Instagram’s safety tools as largely ineffective, a claim Meta denies. This new, more restrictive system appears to be a direct effort to counter such criticisms and demonstrate a commitment to user safety.
The feature will launch first in the US, UK, Australia, and Canada, with a broader international release planned for early next year. Campaigners, however, are urging caution: given Meta’s track record, they argue, the new features should be judged on proven effectiveness, a standard that requires a level of transparency and independent verification the company has yet to provide.
