Senator Coons, colleagues introduce bipartisan Kids Online Safety Act to protect children from online risks

May 4, 2023

WASHINGTON – U.S. Senator Chris Coons (D-Del.) introduced the bipartisan Kids Online Safety Act alongside Senators Richard Blumenthal (D-Conn.) and Marsha Blackburn (R-Tenn.) to protect children on social media platforms.

“Social media platforms provide tremendous value to society by connecting the world and boosting economic growth, but without proper safeguards, our children are too often exposed to objectionable, harmful, or outright dangerous content,” said Senator Coons. “We need more transparency and information about the impact these platforms have on our children, but we can start with the commonsense steps contained in this bill to ensure that platforms do their part to protect minors who use their services.”

The Kids Online Safety Act provides young people and parents with the tools, safeguards, and transparency they need to help protect against dangerous online content. The legislation requires independent audits by experts and academic researchers to ensure that social media platforms are taking meaningful steps to address risks to kids.

Senator Coons has been a longstanding advocate for transparency from social media platforms. Last year, he introduced the bipartisan Platform Accountability and Transparency Act (PATA), which would allow qualified researchers access to platform data to better understand the impact social media companies have on society.

The Kids Online Safety Act is supported by hundreds of advocacy and technology groups, including Common Sense Media, the American Psychological Association, American Academy of Pediatrics, American Compass, Eating Disorders Coalition, Fairplay, Mental Health America, and Digital Progress Institute.

The Kids Online Safety Act:

    Requires that social media platforms provide minors with options to protect their information, disable addictive product features, and opt out of algorithmic recommendations. Platforms would be required to enable the strongest settings by default.

    Gives parents new controls to help support their children and identify harmful behaviors, and provides parents and children with a dedicated channel to report harms to kids to the platform.

    Creates a responsibility for social media platforms to prevent and mitigate harms to minors, such as the promotion of suicide, eating disorders, substance abuse, and sexual exploitation, as well as products that are unlawful for minors (e.g., gambling and alcohol).

    Requires social media platforms to perform an annual independent audit that assesses the risks to minors, the platform’s compliance with this legislation, and whether the platform is taking meaningful steps to prevent those harms.

    Provides academic and public interest organizations with access to critical data sets from social media platforms to foster research regarding harms to the safety and well-being of minors.