Meta Platforms Illustration
Source: NurPhoto / Getty

Instagram users under 16 won’t be able to livestream or unblur nudity in direct messages without parental approval.

With these new measures, Meta is widening its safety protections for teenagers. The social media company also said it was extending safeguards for users under 18 to Facebook and Messenger. The changes will first roll out in the United States, Britain, Canada, and Australia before going global.

Under the new rules, teens under 16 will be blocked from using Instagram Live unless their parents give permission. They will also need permission to “turn off our feature that blurs images containing suspected nudity” in direct messages. The blurring feature was introduced in April 2024 in an attempt to combat the sexual extortion of teens.

Meta launched a teen account program for Instagram in September to give parents more options to supervise their children’s online activity. In another major update, Meta said it is extending those account safeguards to its Facebook and Messenger platforms. The safeguards include setting teen accounts to private by default, blocking private messages from strangers, placing strict limits on sensitive content such as fight videos, sending reminders to get off the app after 60 minutes, and pausing notifications during bedtime hours.

Meta and other social media companies have faced repeated backlash over how their platforms affect the lives of young people. In late 2023, 33 states sued Meta, alleging it deliberately engineered its platforms to be addictive to children and teens, and that it did so in pursuit of profit. The lawsuits also alleged that Meta routinely collects data on underage users without their parents’ consent, in violation of federal law.

In an email at the time, Meta said it was determined to provide teens with “safe, positive experiences online, and have already introduced over 30 tools to support teens and their families.”