Sammy’s Law

Introduced on March 20, 2026

Introduced in Senate

Overview

This bill establishes a mandatory framework requiring large social media platform providers to develop and maintain real-time application programming interfaces (APIs) that enable third-party safety software providers to access and manage children's social media accounts. The legislation aims to empower parents and children with enhanced control over online interactions by creating a technical infrastructure that allows delegation of account management authority to specialized safety software providers. The bill represents a regulatory intervention into social media platform architecture, mandating interoperability between platforms and third-party safety tools to address concerns about child safety in digital environments. By requiring platforms to expose their functionality through standardized APIs, the legislation seeks to create a competitive market for parental control and child safety solutions while maintaining the child's ability to use social media platforms under supervised conditions.

Core Provisions

The bill imposes three primary obligations on large social media platform providers. First, these providers must create and maintain a set of real-time APIs specifically designed for third-party safety software integration. Second, the APIs must enable children or their parents to delegate permission to third-party safety software providers, granting these providers the technical capability to manage the child's online interactions, content exposure, and account settings. Third, the delegation mechanism must provide third-party providers with the same level of control and functionality that is natively available to the child user on the platform. The bill does not specify technical standards for API design, security protocols, or data handling requirements, leaving these implementation details undefined. No timeline for compliance is established, nor are there provisions for phased implementation or grace periods. The legislation lacks definitions for critical terms including what constitutes a 'large' social media platform provider, the scope of 'third-party safety software providers,' or the age range encompassed by 'children.'
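Since the bill specifies no technical standards, a delegation grant under these provisions could be modeled many ways. The sketch below is purely illustrative: the scope names, field names, and the `satisfies_parity` check are assumptions, not anything the bill defines. It captures the one concrete technical test the text does impose, that a delegate must receive the same control "as is available to the child."

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical capability scopes mirroring the bill's three delegation
# targets: online interactions, content exposure, and account settings.
# The bill itself defines no scope taxonomy.
class Scope(Enum):
    INTERACTIONS = "interactions"  # manage who may contact the child
    CONTENT = "content"            # manage the child's content exposure
    SETTINGS = "settings"          # manage the child's account settings

# Everything the child can do natively on the platform.
NATIVE_CHILD_SCOPES = frozenset(Scope)

@dataclass(frozen=True)
class DelegationGrant:
    child_account_id: str
    provider_id: str   # the third-party safety software provider
    granted_by: str    # "child" or "parent" per the bill's text
    scopes: frozenset = NATIVE_CHILD_SCOPES

    def satisfies_parity(self) -> bool:
        """Check the bill's equivalence requirement: the delegate's
        scopes must match what is natively available to the child."""
        return self.scopes == NATIVE_CHILD_SCOPES

# A full-scope grant meets the parity requirement; a partial grant
# (e.g. content-only) would not.
grant = DelegationGrant(
    child_account_id="acct-123",
    provider_id="safety-provider-example",
    granted_by="parent",
)
print(grant.satisfies_parity())  # → True
```

Note the design consequence embedded in the statute: because parity with the child's native capabilities is mandatory, a platform could not offer a reduced-scope API tier to limit its security exposure without falling short of the delegation requirement.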

Key Points

  • Mandatory creation and maintenance of real-time APIs by large social media platform providers
  • APIs must enable delegation of account management authority from children or parents to third-party safety software providers
  • Third-party providers must receive equivalent control capabilities as available to the child user
  • No specified technical standards, security requirements, or API design specifications
  • No implementation timeline or compliance deadlines established
  • Critical definitions including 'large social media platform provider' and 'children' remain undefined

Implementation

The bill contains no provisions identifying responsible federal agencies for oversight, enforcement, or rulemaking authority. No administrative agency is designated to develop implementing regulations, establish technical standards for API functionality, or create certification processes for third-party safety software providers. The legislation lacks any funding authorization or appropriation to support government oversight activities or to assist platforms with compliance costs. There are no reporting requirements imposed on social media platforms regarding API usage, security incidents, or compliance metrics. The bill does not establish a registration or approval process for third-party safety software providers, creating uncertainty about which entities qualify for API access. Enforcement mechanisms, including penalties for non-compliance, investigation procedures, or complaint processes, are entirely absent from the legislative text. The absence of these implementation details suggests the bill may require substantial regulatory development before becoming operational.

Impact

The primary beneficiaries of this legislation are parents seeking enhanced control over their children's social media usage and children who may benefit from improved safety protections in online environments. Third-party safety software providers would gain mandated access to major social media platforms, potentially creating new market opportunities for parental control and content filtering services. Large social media platform providers face significant compliance burdens including the technical costs of developing and maintaining real-time APIs, ongoing support obligations for third-party integrations, and potential liability exposure for security vulnerabilities or data breaches arising from API access. The bill includes no cost estimates or economic impact assessments, making it difficult to quantify the financial burden on platforms or the broader economic effects. Administrative burden extends beyond initial API development to include continuous maintenance, version updates, security monitoring, and technical support for third-party providers. The legislation contains no sunset provisions, making the requirements permanent once enacted. Expected outcomes include increased parental oversight capabilities, potential reduction in children's exposure to harmful content, and greater market competition in child safety software, though these outcomes depend entirely on effective implementation and adoption.

Legal Framework

The bill's constitutional basis likely rests on Congress's Commerce Clause authority to regulate interstate commerce, given that social media platforms operate across state lines and constitute significant commercial enterprises. The legislation may also implicate First Amendment considerations, as mandatory API access could affect platforms' editorial discretion and content curation practices, potentially raising constitutional challenges regarding compelled speech or association. The bill does not reference existing statutory frameworks such as the Children's Online Privacy Protection Act (COPPA) or Section 230 of the Communications Decency Act, creating uncertainty about how this legislation interacts with established legal protections and obligations. No regulatory authority is delegated to federal agencies, suggesting the bill would be self-executing upon enactment, though this approach creates significant implementation challenges. The legislation does not address preemption of state or local laws, leaving open the possibility of conflicting state-level social media regulations or child safety requirements. Judicial review provisions are absent, providing no guidance on standing, venue, or standards of review for challenges to platform compliance or agency actions. The lack of severability clauses means that if any provision is found unconstitutional, the entire statute could be invalidated.

Critical Issues

The bill faces substantial constitutional challenges, particularly regarding First Amendment protections for social media platforms' editorial discretion and potential compelled speech issues arising from mandatory third-party access to platform functionality. The undefined term 'large social media platform provider' creates enforcement uncertainty and potential equal protection concerns if applied inconsistently. Implementation challenges are severe, as the bill provides no technical standards, security requirements, or authentication protocols for API access, potentially exposing platforms and users to data breaches, unauthorized access, or malicious third-party software. The absence of vetting or certification requirements for third-party safety software providers raises significant privacy and security concerns, as unqualified or malicious actors could gain access to children's accounts and personal information. Cost implications are substantial but unquantified, including development costs for API infrastructure, ongoing maintenance expenses, legal compliance costs, and potential liability exposure for platforms. The bill risks unintended consequences including fragmentation of the user experience, increased complexity in platform operations, and reduced platform innovation due to mandatory interoperability requirements. Opposition arguments likely include concerns about government overreach into private platform operations, inadequate consideration of technical feasibility, insufficient privacy protections, and the risk of creating new vulnerabilities in children's online safety rather than mitigating them. The legislation's failure to define 'children' creates ambiguity about the age range covered, potentially extending requirements to teenagers who may not require or desire parental oversight.

Key Points

  • First Amendment challenges regarding compelled platform functionality and editorial discretion
  • Undefined 'large social media platform provider' creates enforcement and equal protection issues
  • No technical standards or security protocols specified for API implementation
  • Absence of vetting requirements for third-party safety software providers poses security risks
  • Unquantified compliance costs for platform providers
  • Potential data breach vulnerabilities from mandatory third-party access
  • No privacy protections or data handling requirements for third-party providers
  • Lack of enforcement mechanisms or penalties for non-compliance
  • Undefined 'children' term creates ambiguity about age range and applicability

From the Legislature

A bill to require large social media platform providers to create, maintain, and make available to third-party safety software providers a set of real-time application programming interfaces, through which a child or a parent may delegate permission to a third-party safety software provider to manage the online interactions, content, and account settings of such child on the large scale social media platform in the same manner as is available to the child, and for other purposes.

Sponsors

  • Democratic Caucus: 1
  • Republican Caucus: 2