On April 20, 2026, a delegation of approximately 60 parents and digital rights advocates arrived on Capitol Hill to initiate a renewed lobbying effort for federal online safety legislation. Organized by groups including Parents RISE! and the Transparency Coalition, the advocates are pushing for the immediate passage of the Kids Internet and Digital Safety (KIDS) Act and the Children and Teens’ Online Privacy Protection Act, commonly known as COPPA 2.0. This mobilization follows a series of significant legal and legislative developments in early 2026 that have shifted the regulatory landscape for social media and artificial intelligence companies.

The advocates specifically highlighted the role of generative AI in facilitating new forms of harm. Technical concerns presented to lawmakers centered on the proliferation of AI-generated deepfakes and the use of "nudify" applications that target minors. The group cited the recent federal lawsuit against xAI over its Grok platform as evidence that current safety frameworks are insufficient to manage automated content generation. According to data from the Childlight Global Child Safety Institute cited during the briefings, technology-facilitated child abuse cases in the United States rose from 4,700 in 2023 to more than 67,000 in 2024, a trend advocates attribute to the integration of AI tools into mainstream social platforms.

Legislative progress has reached a critical juncture as of April 2026. The KIDS Act (H.R. 6291) was advanced by the House Energy and Commerce Committee in March, incorporating provisions from the Kids Online Safety Act (KOSA) such as a mandatory duty of care for platforms. This provision would require services to disable addictive design features and to make opting out of personalized algorithmic recommendations the default setting for users under 17. Simultaneously, the Senate unanimously passed COPPA 2.0 on March 5, 2026, which would ban targeted advertising to minors and implement an "eraser button" allowing the deletion of personal data. However, a primary point of contention remains state preemption, with advocates urging federal lawmakers not to undermine more stringent protections already enacted in over 40 states.

The coalition is also leveraging recent judicial outcomes to bolster its case. On March 25, 2026, a Los Angeles jury found Meta and YouTube liable for addicting and harming a young user, a verdict that advocates say demonstrates the legal feasibility of holding platforms accountable for product design defects. In meetings with members of the Senate Commerce Committee and the House Energy and Commerce Committee, the parents emphasized that federal legislation must codify these safety-by-design standards to ensure uniform protection across all digital services with more than 10 million monthly active users.