New Laws Restrict AI for Minors, Add Privacy Rights
Summary
Washington state has enacted a new law, effective Jan. 1, 2027, that restricts the use of AI with minors and creates new privacy rights. The law defines "companion chatbots" broadly and may sweep in companies using conversational AI for customer engagement, requiring new governance mechanisms and mandatory design elements and exposing violators to a private right of action.
What changed
Washington state has passed HB 2225, a new law set to take effect on Jan. 1, 2027, which introduces significant regulations for AI technologies, particularly those involving minors and conversational AI. The law broadly defines "companion chatbots" to include AI with "anthropomorphic features" or "adaptive, human-like responses" designed to "sustain a relationship across multiple interactions." Companies using such technologies to build brand loyalty or engagement may fall under its purview and must adhere to new governance mechanisms, mandatory design elements, age limits, and reporting requirements. Notably, the law includes a private right of action, similar to that in the My Health My Data Act, allowing individuals to sue for violations.
This legislation reflects a growing trend of rapid legislative responses to AI technologies at both the state and federal levels in the U.S. Companies engaging consumers through AI, especially chatbots and voice agents, must prepare to comply with these new requirements. The broad definitions in the law suggest that many AI applications, even those not explicitly designed as "companion bots," could be affected. The inclusion of private rights of action, as seen in Washington and potentially Oregon (SB 1546), raises the legal risk of non-compliance. Businesses should review their AI deployments and privacy practices now to align with these evolving regulations.
What to do next
- Review AI chatbot and voice agent functionality against the law's "companion chatbot" definition.
- Assess compliance with new governance, design, age limit, and reporting requirements.
- Prepare for potential private rights of action and associated legal risks.
Penalties
Private right of action (Washington HB 2225); Oregon SB 1546 goes further by adding statutory damages to its private right of action.
Source document (simplified)
OPINION Published
27 March 2026
Contributors:
Cobun Zweifel-Keegan
CIPP/US, CIPM
Managing Director, D.C.
IAPP
There are two tidal waves on a collision course in 2026.
One is the technological wave of conversational artificial intelligence that is rapidly changing how consumers interact with companies. As text-based chatbots and voice agents mature with simulated empathy and adaptive personalities, it is only a matter of time before human-like machine interactions become ubiquitous.
Though consumers are probably not particularly thrilled about their call center interactions shifting to bots that can pass the Turing test, usage surveys appear to show fewer qualms about other contexts. Confiding in chatbots is already a surprisingly popular pastime among users of all ages, beyond the high levels of teen adoption that sparked headlines last year. For example, a recent study by KFF shows that 32% of adults are turning to generative AI tools for health information and advice. In the dating context, Match.com just released its own report claiming that 1 in 4 singles in America are "using AI to filter matches, write messages, or reflect on dating habits."
Meanwhile, a rapid legislative reaction the likes of which we have rarely seen in tech policy is swelling across the U.S. Policymakers and their constituents have major concerns about the mental health and safety risks posed by chatbots — particularly companion bots like character.AI, but also all-purpose services like ChatGPT, which are often used in companion-like ways. This policy response at the state and federal levels now includes a set of stringent and increasingly diverse bills that are poised to impose governance mechanisms, mandatory design elements, age limits, reporting requirements and even private rights of action for any service that meets what are frequently broad definitions of a "companion chatbot."
Washington's HB 2225 is a prime example. As it happens, it is past time to take a look at this bill because it was just signed into law by Gov. Bob Ferguson, D-Wash., this week with an effective date of 1 Jan. 2027. The reality is that every company will soon engage consumers via a chatbot of one kind or another, and as these bots become increasingly conversational with "anthropomorphic features" or "adaptive, human-like responses" designed to "sustain a relationship across multiple interactions," they will need to meet the requirements of Washington's new law.
If a company seeks to build brand loyalty or foster engagement through its chatbot or voice agent integrations, it may easily stray into the regulated "companion" territory. This and other broad definitions mean these new laws are likely set for a collision course with the expanding use of similar technologies across the economy.
Also giving state session watchers pause, Washington's new law includes a private right of action in the same manner as the earlier My Health My Data Act. Oregon's SB 1546, which is also expected to be signed into law soon, goes even further, adding statutory damages to its private right of action.
Synthetic relationships by design
The range of legislative solutions to the risks posed by chatbots echoes many prior waves of privacy and digital responsibility laws. In fact, some of the biggest historical battles in tech policy are converging for rematches in the chatbot context: age verification requirements, data collection restrictions like those in Maryland's HB 952, and content moderation requirements with the added gloss of a duty to report certain high-risk outcomes.
Chatbot laws are a unique new battleground not just because they sit at the intersection of so many issues, but also because they are focused on what is becoming such an intimate part of consumers' lives. In many ways, these bills seek to regulate loneliness, to determine the contours of a healthy synthetic relationship.
Washington's law, for example, restricts "manipulative engagement techniques" for minors, including, as the Troutman Pepper Locke summarizes in their analysis, "mimicking romantic partnership or building romantic bonds, or soliciting gift-giving, in-app purchases, or other expenditures framed as necessary to maintain the relationship with the AI companion."
A rapidly emerging patchwork
California and New York passed their own laws last year and are already considering more. Not to be outdone, Idaho, the conservative neighbor to the east of Washington and Oregon, passed its own unique chatbot bill. And behind Oregon and Idaho, a slew of other states are lined up with bills that have already passed at least one chamber, including Georgia, Hawaii, Iowa, Maryland and Pennsylvania.
Federal proposals include chatbot restrictions too. The KIDS Act package, recently passed out of the U.S. House Energy and Commerce Committee, includes the bipartisan Safeguarding Adolescents from Exploitative Bots Act. This bill would create a federal standard echoing many of the most common ideas in the states, including transparency requirements, mandatory breaks after three hours of uninterrupted use, and mental health safeguards.
How can you keep up with all of this? For starters, the Future of Privacy Forum's Justine Gluck and Rafal Fryc have been steadfastly following every twist and turn in the chatbot legislative saga. They just released a very helpful tracker.
Please send feedback, updates and simulated empathy to cobun@iapp.org.
This article originally appeared in The Daily Dashboard and U.S. Privacy Digest, free weekly IAPP newsletters. Subscriptions to this and other IAPP newsletters are available on the IAPP website.
This content is eligible for Continuing Professional Education credits. Please self-submit according to CPE policy guidelines.
Tags: AI and machine learning, Law and regulation