Congress Weighs Bills That Could Expand Social Media Regulation for Millions of Americans
For too long, the digital lives of American children have been governed not by safety standards, but by the relentless pursuit of profit by Big Tech. Every scroll, click, and interaction is monitored, analyzed, and exploited by algorithms designed for maximum engagement, often at the expense of children’s mental health. Studies show that children and adolescents who spend more than three hours a day on social media face roughly double the risk of mental-health problems, including symptoms of depression and anxiety. (HHS.gov) Meanwhile, personal data—the intimate details of children’s lives—is collected, commodified, and shared with little meaningful oversight, creating a landscape where both mental well-being and privacy are secondary to corporate revenue. In this context, federal intervention is not merely a policy preference—it is a moral imperative. The government must step in to protect children and safeguard the digital rights of all Americans, establishing rules that prioritize human welfare over profit.
The bipartisan push for legislation like the Kids Online Safety Act (KOSA), the Kids Off Social Media Act (KOSMA), and the American Data Privacy and Protection Act (ADPPA) represents a long‑overdue reckoning. These bills collectively aim to establish a new social contract for the digital age, requiring technology companies to place the well‑being of users—and especially minors—above their insatiable appetite for revenue. Each piece of legislation addresses a different facet of the digital crisis: KOSA focuses on the design of platforms and the duty of care toward minors, KOSMA places hard restrictions on access for the youngest users, and ADPPA tackles the broader data privacy and civil‑rights implications of an unregulated online environment.
I. Ending the Algorithm’s Addiction: The New Mandate for Child Safety
The most insidious threat to children online today is not explicit content or online predators alone, but the very structure of social media platforms themselves. Companies have weaponized machine learning to track exactly how long a child watches a video, which posts they engage with, and what emotional responses they exhibit, creating personalized, perpetual feeds designed to maximize time spent on the platform. Research shows that higher social-media use correlates with higher rates of self-harm, loneliness, and psychological distress. (PMC) This hyper-tailored engagement amplifies harmful content, reinforces negative behaviors, and can exacerbate vulnerabilities such as low self-esteem, eating disorders, or suicidal ideation.
KOSA and KOSMA represent a dual‑pronged legislative response to this crisis, each targeting a different aspect of the online environment.
1. KOSA: Establishing a “Duty of Care” and User Control
The Kids Online Safety Act sets forth a transformative “duty of care” for platforms serving minors. For the first time, companies are legally responsible for preventing and mitigating harms they know their services can cause to children, including content promoting suicidal behaviors, self-harm, eating disorders, substance abuse, and sexual exploitation. (Congress.gov) Beyond addressing harmful content, KOSA targets the design elements that make platforms addictive. Companies must provide children and parents with tools to protect personal information, disable addictive features, and, most crucially, opt out of algorithmic recommendations—including the option to use a chronological feed instead of an engagement-maximizing one. (Tech Policy Press) Allowing minors to select a chronological feed over a personalized, engagement-driven one gives families a critical measure of control over time and attention, helping to counteract the psychological pull of endless scrolling.
2. KOSMA: Setting Hard Boundaries for the Youngest Users
While KOSA addresses corporate responsibility, the Kids Off Social Media Act (KOSMA) imposes concrete protections for younger children. By banning social media accounts for children under 13 and prohibiting personalized recommendation algorithms for users under 17, KOSMA acknowledges that developing brains are particularly susceptible to psychological manipulation. The legislation also empowers schools to limit social media access on federally funded networks, allowing classrooms to remain spaces for education rather than distraction.
II. The Uniform Shield: Comprehensive Federal Data Privacy
Beyond child safety, Americans face a nationwide data-privacy crisis. Personal information—from health records to geolocation to financial behavior—is collected, analyzed, and sold with minimal oversight. The result is a patchwork of state laws, each with different requirements, leaving both consumers and businesses confused and exposed. A national standard is long overdue, and the American Data Privacy and Protection Act (ADPPA) seeks to provide exactly that.
1. Uniformity and Certainty
The ADPPA is structured to preempt most state laws and establish a single, consistent federal baseline. (Usercentrics CMP) This clarity benefits businesses, which gain one clear framework for compliance, and consumers, who are assured of uniform protections regardless of where they live. By creating consistency across the nation, the ADPPA closes the loopholes and ambiguities that have allowed companies to exploit data with little accountability.
2. Foundational Consumer Rights and Data Minimization
At its core, the ADPPA enshrines the principle of data minimization: companies may collect only the data strictly necessary to deliver a requested product or service, reversing the current model in which businesses are incentivized to hoard information. (shardsecure.com) Americans gain concrete rights, including the ability to access, correct, and delete personal data, opt out of targeted advertising, and prevent third-party data transfers. Sensitive information—such as geolocation or health records—receives heightened protection, reducing the risk of exploitation or misuse.
3. Fighting Algorithmic Discrimination and Enforcement
The ADPPA extends civil‑rights protections into the digital realm. Companies are required to conduct algorithmic impact assessments (AIAs) to prevent discrimination based on race, sex, religion, or other protected characteristics in critical areas such as employment, credit, and housing. Legal summaries confirm that large data holders must conduct such assessments and document efforts to mitigate potential harms. (EPIC) Enforcement is robust, with the Federal Trade Commission (FTC), state attorneys general, and private citizens all empowered to hold violators accountable—transforming digital negligence into enforceable law. (Congress.gov)
Conclusion: A New Era of Digital Accountability
Together, KOSA, KOSMA, and ADPPA mark a historic shift in U.S. technology policy. Big Tech can no longer operate as an unaccountable, self-regulated industry prioritizing profit over people. These bills are not anti-innovation—they are pro-humanity. They create safer digital spaces for children, impose responsibility on corporations, and restore control to individual users over their own data. By supporting these measures, lawmakers and citizens alike champion a future where technology serves human well-being, not the other way around. In doing so, the United States can lead the world in building an online environment that is safe, equitable, and aligned with the public interest.