
Meta Fined $400 Million for Failing to Protect Children's Privacy on Instagram

The company says it will appeal the fine from Ireland's Data Protection Commission.

[Image: Meta logo on a phone screen, with the EU flag in the background]
Meta faces a hefty GDPR fine for its handling of children's data.
James Martin/CNET

Meta is facing a fine of 405 million euros, or just over $400 million, from Ireland's Data Protection Commission for failing to safeguard children's information on Instagram. 

The country's data watchdog had accused Instagram of setting children's accounts to "public" by default and allowing children to operate business accounts on the platform, which could leave their phone numbers and email addresses exposed.

Full details of the decision are expected to be published next week, a commission spokesman said Tuesday. 

Meta confirmed the fine and said it plans to appeal the decision. 

"This inquiry focused on old settings that we updated over a year ago, and we've since released many new features to help keep teens safe and their information private," a Meta spokesperson told CNET via email on Tuesday. "Anyone under 18 automatically has their account set to private when they join Instagram, so only people they know can see what they post, and adults can't message teens who don't follow them." 

The spokesperson added that Meta disagrees with how the fine was calculated and is reviewing the rest of the commission's decision.

The $400 million fine would be the second-largest issued to a tech company for a violation of the European Union's General Data Protection Regulation, behind Amazon's record-setting $888 million fine in 2021.

This isn't the first time Instagram has faced scrutiny over how it protects children using the platform. A child advocacy group found in April that Instagram promoted pro-eating disorder content to children as young as 9. Last year, members of Congress wrote a letter to Instagram, urging it to "cease all efforts" to launch a kids version of the app, after internal research indicated that Instagram can be harmful to young users. The platform has since introduced tools to help protect children, including an age verification tool.