Social
Instagram updates privacy settings to protect kids from adults
Adult-run accounts that mostly feature children will no longer be recommended to suspicious adults, among other new protections.

Instagram is rolling out new safety features aimed at protecting children from potentially harmful adults on its platform.
One major update is that adult-run accounts that mainly post pictures of children, such as those managed by parents or talent agents, will no longer be recommended to suspicious adult users.
The change comes after a 2023 lawsuit that accused Meta of enabling predators by allowing child sexual abuse material to spread on its platforms and by promoting disturbing content through its recommendation systems.
An investigation by The Wall Street Journal also found that Instagram’s algorithm was helping pedophiles connect and share content.
To address these concerns, Meta has introduced new protections. Suspicious adults, including those who have already been blocked by teens, will be prevented from seeing or interacting with child-centered accounts.
Instagram will also hide comments from these users and make it harder for them to find, or be found by, such accounts in search.
Meta says most accounts featuring children are used in innocent ways, such as by parents sharing family life or managers promoting young talent.
But the company has faced criticism for allegedly allowing some parents who exploit their children for money to keep using its platforms.
This new step builds on previous safety measures, like the removal of the option for child-focused accounts to earn money through gifts or subscriptions.
Other protections coming soon will mirror those already in place for teen users. For example, child-centered accounts will be set to the strictest messaging settings by default.
Instagram will also filter out offensive comments automatically and provide easier ways to block and report people.
Teen users will also now see the date an account joined Instagram, which can help them spot fake or suspicious accounts.
These new tools are meant to give teens, and now adults managing kids’ content, more control and awareness when interacting on the platform.
Do you think Instagram’s new safety measures will effectively protect children from predators on the platform? Or do these changes not go far enough to address the fundamental risks of social media for kids? Tell us below in the comments, or reach us via our Twitter or Facebook.
