Instagram revising teen account settings amid backlash
Instagram is overhauling how it treats teenage accounts, rolling out a PG-13-style filter and other default safeguards aimed at younger users. Meta, the platform's parent company, announced the changes this week amid mounting criticism of how the app handles teens. The company says the new settings will limit exposure to explicit and violent material by default.
The move builds on changes Instagram launched last year when it introduced dedicated teen accounts. In September 2024 the app made accounts for users under 18 private by default and began hiding messages from unknown senders. Meta frames the PG-13 framework as the next layer of protection on top of those earlier defaults.
Under the updated approach, Instagram will filter what teens see using standards similar to those applied to PG-13 films, keeping potentially offensive or violent posts out of a teen's feed. Material that remains visible to adults may be suppressed or labeled when viewed from a teen account. Opting out of these protections will require a parent's explicit permission.
A key focus is keeping teens from following accounts that routinely post sexually suggestive material. Teen profiles will be blocked from following users who regularly post suggestive images or who advertise explicit services such as OnlyFans in their bios. The policy aims to reduce easy access to adult-oriented accounts.
Search and Explore results for teen profiles will be tightened so recommendations skew less adult, and Instagram said its AI tools will avoid giving answers that are “out of place in a PG-13 movie.” The changes rely on a mix of automated detection and policy rules. Meta describes the package as a layered response rather than a single technical fix.
The announcement follows a public critique led by former Meta employee Arturo Béjar and several nonprofits, which argued that Instagram's protections for minors were weak or broken. The report examined the platform's teen safety measures in detail and raised questions about how well the controls actually worked. Its release intensified calls for clearer, enforceable safeguards.
Specifically, the review examined 47 of Instagram's 53 teen safety features and found that most were either ineffective or no longer functioning. Those findings became a focal point for advocacy groups and reporters pressing Meta for faster change. Meta pushed back on that characterization and defended its efforts.
Meta called Béjar’s report “misleading” and “dangerously speculative.” The company said the analysis did not reflect the current state of its systems and stressed that further technical and policy updates are underway. Meta also highlighted that changes would be rolled out in stages to iron out issues.
Meta began rolling out the revamped teen settings to users in the United States, Canada, the United Kingdom and Australia this week, and it expects to make the new version available to all users before the end of 2025. The phased rollout is meant to let the company monitor performance and adjust controls as needed. Early results will shape enforcement and fine-tuning.
Parents will gain more say over opt-outs and more control over what teens can see, but observers stress that the real test is enforcement and consistent detection of bad actors. Calls for transparency, independent audits and clear reporting on how safeguards perform are growing louder. Parents should review account settings as the rollout continues.
