Evolving age assurance requirements and renewed federal attention are reshaping expectations around children’s consent, product design, and youth data protection.
Harry Chambers
Regulatory Content Strategist
April 2, 2026
While children’s online experiences have continued to expand, so has regulatory scrutiny over how organizations collect, use, and protect minors’ personal data. In 2025 and early 2026, legislators and regulators have accelerated efforts to strengthen protections for children and teens, with a growing emphasis on age assurance, parental consent, and safety-by-design obligations.
In 2026 alone, the governor of South Carolina enacted the South Carolina Age Appropriate Design Code, and the governor of Alabama enacted House Bill 161 on app store providers and developers. These laws add to children's data protection legislation that is already in force or set to take effect soon, such as Vermont's Age Appropriate Design Code on January 1, 2027.
From state-level social media regulation to app store accountability laws and renewed federal momentum behind the Kids Online Safety Act (KOSA), the compliance landscape for children’s data is becoming more complex and harder to ignore.
For example, a social media platform used by teenagers may need to disable targeted advertising by default, introduce parental supervision tools, and limit data collection for features such as location sharing or personalized recommendations.
Children’s data privacy laws in the U.S. have traditionally centered on the Children’s Online Privacy Protection Act (COPPA), which requires verifiable parental consent for online data collection from children under 13. However, recent legislative developments reflect a broader shift: expanding protections to teens, regulating platform design, and embedding duty-of-care standards into law.
States are increasingly moving beyond notice-and-consent frameworks, focusing instead on how digital products are designed, how defaults are set, and how minors are protected by default, even when parental consent is not the primary mechanism. The scope of services subject to children's data laws is also broadening: under the Vermont Age Appropriate Design Code, a service is considered "reasonably likely to be accessed by a minor" if at least 2% of its users are between 2 and 17 years old.
In practice, this means a video streaming service, gaming platform, or educational app may fall under youth data rules even if it does not explicitly target children, as long as a measurable portion of its audience includes minors.
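To make the scope test concrete, the following is a minimal sketch of the Vermont-style 2% threshold described above. The function name and inputs are hypothetical, not statutory language, and a real assessment would rely on an age assurance methodology rather than self-reported ages.

```python
# Illustrative scope check under an assumed Vermont-style threshold:
# a service is "reasonably likely to be accessed by a minor" when at
# least 2% of its users are between 2 and 17 years old.

MINOR_SHARE_THRESHOLD = 0.02  # 2% of the user base aged 2-17

def likely_accessed_by_minors(user_ages: list[int]) -> bool:
    """Return True if the minor share of the user base meets the threshold."""
    if not user_ages:
        return False
    minors = sum(1 for age in user_ages if 2 <= age <= 17)
    return minors / len(user_ages) >= MINOR_SHARE_THRESHOLD

# Example: 3 minors out of 100 users is a 3% share, so the service
# would fall within scope under this reading.
ages = [15, 16, 13] + [30] * 97
print(likely_accessed_by_minors(ages))  # True
```

The point of the sketch is that scope turns on audience composition, not on whether the service markets itself to children.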
State Attorneys General in California and Connecticut have also signaled that children's data privacy will remain a regulatory focus, noting ongoing investigations into privacy and safety risks.
In February 2026, the South Carolina Age Appropriate Design Code entered into effect on the date of its passage, imposing sweeping obligations on online services reasonably likely to be accessed by minors. The law requires covered services to exercise reasonable care in the use of minors’ personal data, limit data collection, prohibit certain targeted advertising practices, and provide parental tools and transparency measures.
Notably, the law reflects an age-appropriate design approach rather than a consent-first model, signaling a broader regulatory trend toward default protections and product-level safeguards for children and teens.
For instance, a social platform may be required to disable algorithmic content recommendations that promote compulsive use patterns for younger users or ensure privacy settings default to the most protective configuration.
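A privacy-by-default configuration like the one just described can be sketched as follows. The setting names are illustrative only and are not drawn from any specific statute or product.

```python
# Hypothetical default-settings sketch: accounts in a minor age range
# start from the most protective configuration (private profile,
# personalization and location sharing off), while adult accounts keep
# standard defaults.

def default_settings(age: int) -> dict:
    """Return account defaults, most protective for users under 18."""
    is_minor = age < 18
    return {
        "profile_public": not is_minor,
        "personalized_recommendations": not is_minor,
        "location_sharing": not is_minor,
        "targeted_advertising": not is_minor,
    }

print(default_settings(14))
# {'profile_public': False, 'personalized_recommendations': False,
#  'location_sharing': False, 'targeted_advertising': False}
```

The design choice worth noting is that protection is the starting state: a minor (or parent) may later opt in to individual features, but nothing risky is on by default.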
Other state legislation in the US, initially inspired by the UK Age Appropriate Design Code passed in 2020, has started to move beyond risk-based protections for minors' personal data. Proposed bills in state legislatures such as Arizona, New Mexico, Kentucky, and Virginia now extend beyond those earlier requirements.
Alabama’s newly enacted app store law also reflects these trends, introducing another compliance layer by shifting responsibility upstream to app stores and developers. Signed in February 2026, the law requires age category verification and, for minors, confirmation that verifiable parental consent has been obtained before app downloads or significant app changes occur.
Developers are also restricted in how age data can be used and must apply the lowest applicable age category when implementing restrictions or defaults. App store providers, meanwhile, must request and verify age categories using compliant verification methods.
A gaming developer releasing an update that introduces social chat or in-app purchases may need to verify the user’s age category and ensure parental consent is obtained before enabling those features for minors.
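The gating logic in that scenario can be sketched as a simple check, assuming an Alabama-style regime in which users are assigned an age category and parental consent is recorded before restricted features are enabled. The category names and the consent flag are hypothetical, not taken from the statute.

```python
# Illustrative feature gate for an app update: users in a minor age
# category need verifiable parental consent on file before features
# such as social chat or in-app purchases are enabled. Per the law's
# "lowest applicable age category" rule, when a user could fall into
# more than one category, the developer applies the most restrictive.

AGE_CATEGORIES = ("under_13", "13_15", "16_17", "adult")

def can_enable_feature(age_category: str, parental_consent_on_file: bool) -> bool:
    """Decide whether a restricted feature may be enabled for this user."""
    if age_category not in AGE_CATEGORIES:
        raise ValueError(f"unknown age category: {age_category}")
    if age_category == "adult":
        return True  # no parental consent required for adults
    return parental_consent_on_file  # minors need verified consent first

print(can_enable_feature("13_15", parental_consent_on_file=False))  # False
print(can_enable_feature("adult", parental_consent_on_file=False))  # True
```

In practice the consent flag would come from an auditable consent record rather than a boolean parameter, but the control point is the same: the feature stays off until consent is confirmed.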
Alabama has joined other states such as Texas, Utah, Louisiana, and California in passing legislation that requires age verification and age-gating of content accessible to minors, alongside parental disclosures by app developers. These laws go beyond the consent requirements common in comprehensive privacy legislation such as the California CCPA or Colorado CPA by mandating verifiable consent based on industry standards.
At the federal level, children’s data protection remains fragmented, but pressure is mounting. In February 2026, a bipartisan coalition of 40 state Attorneys General urged Congress to pass the Senate version of the Kids Online Safety Act (KOSA), citing concerns that the House version would preempt state laws already protecting minors.
The Senate version of KOSA emphasizes duty-of-care obligations for platforms and seeks to preserve states’ ability to respond to evolving online harms affecting children’s mental health and safety. This federal-state tension highlights a key compliance challenge: organizations must track both current state laws and potential federal changes without assuming preemption will simplify requirements.
Under duty-of-care expectations, a platform hosting user-generated content may need to assess whether recommendation algorithms amplify harmful material to minors and implement safeguards to limit exposure.
Across these developments, a few consistent themes are emerging: default protections rather than consent checkboxes, age scopes that extend to teens as well as children, and design-level obligations embedded into products themselves.
For organizations that design, market, or distribute digital services, children’s data compliance is no longer a niche issue. The combination of state-level enforcement, expanding age scopes, and renewed federal attention means companies must adapt their privacy, product, and governance practices accordingly.
As lawmakers continue to experiment with new models for protecting minors online, organizations should expect continued regulatory evolution rather than consolidation.
OneTrust helps organizations operationalize parental consent by enabling parent–child identity relationships within Collection Points. By configuring OneTrust Hosted Web Forms or API-based Collection Points to capture a parent identifier, organizations can link parent and child identities into a single data subject group and manage those relationships centrally. This setup ensures consent is correctly attributed and that Double Opt-In and preference communications are routed appropriately, typically to the parent, based on the identifiers provided, supporting consistent and auditable consent management for children’s data.
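To illustrate the parent–child linking pattern described above, the following sketch builds a consent-receipt payload that ties a child data subject to a consenting parent identifier. The field names here are illustrative assumptions only, not OneTrust's actual request schema; consult the product documentation for the real Collection Point payload format.

```python
# Hypothetical consent-receipt payload linking a child's identity to a
# parent identifier, loosely modeled on an API-based collection point.
# All field names are assumptions for illustration, not a real API.

def build_consent_receipt(child_id: str, parent_id: str, purpose: str) -> dict:
    """Assemble an illustrative consent record attributing consent to a parent."""
    return {
        "identifier": child_id,          # the child data subject
        "parentIdentifier": parent_id,   # links the consenting parent
        "purposes": [purpose],           # what the consent covers
        "doubleOptIn": True,             # confirmation routed to the parent
    }

receipt = build_consent_receipt("child-001", "parent-001", "marketing-email")
print(receipt["parentIdentifier"])  # parent-001
```

The key property, whatever the concrete schema, is that the parent identifier travels with the consent record, so downstream communications and audits can resolve who actually authorized the processing.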
Beyond consent collection, OneTrust supports identity verification through integrations with external identity providers using OpenID Connect (OIDC) and the OneTrust ID Verification API. These options allow organizations to verify identities using approaches such as score-based authentication, one-time passcodes, and knowledge-based authentication, while maintaining control over verification workflows and limiting unnecessary exposure of personal data. Together, these capabilities help marketers and digital service providers collect children’s data responsibly, verify identities when needed, and manage parental consent in line with evolving regulatory expectations.
Children’s data laws are entering a new phase defined less by static consent checkboxes and more by ongoing responsibility for how digital experiences impact minors. With states like South Carolina and Alabama setting benchmarks and federal debates over KOSA intensifying, organizations that proactively align privacy, product, and governance strategies will be better positioned to adapt as expectations continue to rise.
Verifiable parental consent requires organizations to confirm that a child’s parent or guardian has authorized the collection and processing of personal data before it occurs. Accepted methods may include credit card verification, government ID checks, or knowledge-based authentication processes.
No. Many laws apply to services that are reasonably likely to be accessed by minors. This means platforms such as gaming services, streaming platforms, educational tools, and social networks may fall within scope even if they do not explicitly target children.
Recent legislation reflects concerns that consent alone does not sufficiently protect minors. Laws increasingly require services to limit data collection, disable certain advertising practices, and design experiences that reduce risks to children’s safety and wellbeing.
Organizations should assess how minors interact with their digital services, implement age assurance or age-gating mechanisms where appropriate, and ensure parental consent workflows and product design choices align with emerging duty-of-care expectations.