
Kids Online Safety Act Introduced in the Senate

The legislation would require social media platforms to provide more safeguards for children.

Sen. Marsha Blackburn speaks in October during a hearing about protecting kids online. The methods social media companies employ to increase user engagement have come under increased scrutiny.
Samuel Corum/Getty Images

Following a series of congressional hearings on the impact of social media on children, Sens. Richard Blumenthal, a Democrat from Connecticut, and Marsha Blackburn, a Republican from Tennessee, on Wednesday introduced a bill that aims to hold social media platforms responsible for protecting minors age 16 and younger.

The Kids Online Safety Act would require platforms to offer settings that would limit minors' use of the platforms and restrict platforms' use of their personal data. The proposed settings include the ability to opt out of "algorithmic recommendation systems" that pull from a user's personal data to suggest content, like the algorithms TikTok employs to keep users scrolling.

The legislation would also impose on social media platforms a duty of care to prevent harm to minors. The platforms would become legally responsible for shielding minors from harassment, sexual exploitation and the promotion of substance abuse, eating disorders, self-harm and suicide.

"In hearings over the last year, Senator Blumenthal and I have heard countless stories of physical and emotional damage affecting young users, and Big Tech's unwillingness to change," Blackburn said in a release. "The Kids Online Safety Act will address those harms by setting necessary safety guiderails for online platforms to follow that will require transparency and give parents more peace of mind."

Facebook parent Meta and Snapchat parent Snap are facing a lawsuit from a mother who says the companies designed Instagram and Snapchat to be addictive, failed to keep minors safe and thus contributed to her 11-year-old daughter's death by suicide. 

Neither Meta, Twitter nor TikTok immediately responded to requests for comment.