Policymakers around the world are trying to pass legislation that addresses data privacy issues affecting children and teens. To avoid costly fines and litigation, Ceren Canal Aruoba, co-leader of BRG’s consumer protection, product liability and environment practice, says C-suite executives and corporate integrity professionals would do well to understand the fast-changing regulatory landscape and how to proactively mitigate risk.
Last August, a bipartisan coalition of 44 US state attorneys general sent a formal letter to leading AI companies expressing concerns about the safety of children interacting with AI chatbots. The letter was just a hint of what’s to come.
As AI use intensifies, so will regulatory scrutiny of data privacy issues affecting children and teens, whether related to harmful content, manipulative design, noncompliant surveillance, automated profiling or any number of other violations. A wave of costly penalties, settlements and reputational damage is already emerging, including last year’s settlement by Disney over allegations of unlawful collection of children’s personal data.
But the US is not alone, not by a long shot, which may further complicate matters for corporate compliance, data privacy and cybersecurity professionals.
Global developments
Many jurisdictions around the world have recently passed, or are attempting to pass, legislation to address data privacy issues affecting children and teens. Most of these efforts are intended to create guardrails, often limiting companies’ ability to advertise to people under 18 and to collect and profit from their personal data.
Broadly speaking, the new policies extend beyond data privacy into product governance and algorithmic accountability (e.g., a growing number of jurisdictions ban the use of algorithmic personalization and engagement features for minors). Enforcement is picking up, too, and sites, apps and platforms can be vulnerable even when they are not specifically designed for or targeted at children.
A few key regulatory developments to keep top of mind:
US
Action is taking place at the state and federal levels, including significant amendments to the Children’s Online Privacy Protection Act (COPPA) that went into effect in June 2025 and with which operators have until April 22, 2026, to comply. Though COPPA applies to children under 13, pending federal legislation would expand its protections to those under 17. This legislation would also ban targeted advertising to minors and require companies to create an “eraser button” that lets users delete their own data. Numerous states have passed their own legislation as well, including: age-appropriate design code laws in Nebraska and Vermont; social media and app store laws that require age verification and parental consent in Utah, Texas, Louisiana and Nebraska; and youth-related updates to the California Privacy Rights Act.
EU
Age assurance guidelines adopted by the European Data Protection Board in February 2025 will guide enforcement of age-gating methods across the EU. Relatedly, the EU may revise the GDPR as part of a broader digital omnibus effort. While the changes would not be written specifically for children, they could still materially change how companies collect and use children’s data.
UK
Several overlapping laws now govern children’s online experiences. Together, they create stricter design, data-use and safety obligations for any digital service children might use, backed by active enforcement through 2026. For instance, the age-appropriate design code is a UK framework that requires digital services likely to be accessed by children to minimize data collection, default to high privacy settings and avoid nudging or dark patterns. Here is a summary of other measures:
- In April 2024, the Information Commissioner’s Office launched the children’s code strategy, which seeks to ensure that social media and video-sharing platforms comply with data protection law and conform to the standards of the children’s code.
- The Data (Use and Access) Act 2025, which will be phased in by June 2026, makes changes to data consent and lawful processing and expands data-sharing permissions; while not child-specific, it interacts with the children’s code and may lower friction for data use, making children’s protections more dependent on design compliance than on consent alone.
- The 2023 Online Safety Act requires platforms to conduct formal children’s risk assessments, identify harms (content, algorithms, data use) and implement mitigation measures. Enforcement is handled by Ofcom and can lead to significant fines and service restrictions.
Australia
A coordinated set of laws will require platforms to verify or estimate users’ ages, restrict underage access to social media and redesign data practices to better protect children.
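The regimes above converge on a common pattern: assure the user’s age, and default anyone who is (or might be) a minor to high-privacy settings with personalization off. Here is a minimal sketch of that pattern; the setting names and the fail-closed rule are illustrative assumptions, not any regulator’s mandated design.

```python
# Illustrative sketch of "default to high privacy, fail closed on age".
# All names are hypothetical; no statute prescribes this exact API.
from dataclasses import dataclass
from typing import Optional


@dataclass
class PrivacySettings:
    personalized_ads: bool = False   # off by default for minors
    algorithmic_feed: bool = False   # engagement features disabled
    profile_public: bool = False     # private by default
    data_sharing: bool = False       # no third-party sharing


def settings_for_user(verified_age: Optional[int],
                      estimated_age: Optional[int]) -> PrivacySettings:
    """Choose account defaults from age-assurance signals.

    Prefers a verified age over an estimate, and treats unknown ages
    as under 18, so the service fails closed rather than exposing a
    possible minor to adult defaults.
    """
    age = verified_age if verified_age is not None else estimated_age
    if age is None or age < 18:
        return PrivacySettings()  # high-privacy defaults for (possible) minors
    # Adults may opt in to personalization; sharing still starts off.
    return PrivacySettings(personalized_ads=True, algorithmic_feed=True,
                           profile_public=True, data_sharing=False)
```

The fail-closed branch is the part enforcement actions tend to focus on: a service that defaults unknown-age users to adult settings is effectively betting that no minors use it.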
Significant business implications
These global developments signal a sea change in how regulators are treating data privacy issues affecting children and teens. Costly regulatory fines will inevitably follow for those who do not comply with the new legislation, potentially paired with mandatory product redesigns and/or ongoing oversight.
In Australia, for example, eSafety code violations could lead to fines of up to $50 million, while FTC settlements for COPPA violations can run into the tens of millions of dollars (Epic Games, maker of the popular “Fortnite” video game, settled with the agency over such violations for $275 million in 2022). In the US, large-scale class actions could follow in the wake of enforcement actions.
For executives and corporate compliance professionals, there are several lessons to be learned from recent enforcement trends, particularly in the US:
- Aggressive dual enforcement: Federal agencies (e.g., the FTC and DOJ) and private litigants are simultaneously pursuing cases, covering both regulatory and financial penalties.
- COPPA vs. US state-law limits: Private US state-law claims involving under-13 minors typically face preemption by COPPA. However, claims for teens ages 13 to 17 are gaining traction under state consumer statutes.
- Algorithmic personalization under scrutiny: Lawsuits focus not just on data collection but on how data is used in recommendation engines, driving enforcement into platforms’ analytics and engagement mechanisms.
- Responsibility in EdTech: Stakeholders should treat school data collection as requiring parental consent under COPPA, even when tools are deployed at scale under school contracts.
- Global momentum: Regulators across jurisdictions are converging on similar issues: age assurance, algorithmic profiling, engagement-driving features and dark-pattern design.