
Colorado Bill on AI in Health Care Utilization Review

Source: CO Legislature Bill Search (leg.colorado.gov)
Published January 1st, 2026
Detected March 5th, 2026

Summary

Colorado bill HB26-1139 proposes new requirements for the use of artificial intelligence in health care utilization review and for mental health companion chatbots. The bill requires that AI systems used for utilization review consider individual patient circumstances and that denials of coverage be reviewed by a licensed clinician. It also defines and regulates mental health companion chatbots and prohibits AI systems from engaging in the unauthorized practice of psychotherapy.

What changed

Colorado bill HB26-1139 introduces significant regulations for the use of AI in health care, specifically targeting utilization review and mental health companion chatbots. For utilization review, entities such as insurance carriers and managed care organizations must ensure AI systems do not base decisions solely on group data and must consider individual patient clinical circumstances. Denials of coverage must be reviewed by a licensed clinician. The bill also defines "mental health companion chatbots" and sets strict requirements, including clear disclosure that the AI is not human, protocols for suicidal ideation, and restrictions on data sharing. It prohibits AI systems from engaging in the unauthorized practice of psychotherapy.

Entities utilizing AI for utilization review in Colorado will need to update their processes to comply with these new requirements, particularly regarding individual patient data and clinician oversight for denials. Healthcare providers and insurers must ensure their AI systems meet these standards by the bill's effective date. The regulations around mental health companion chatbots will impact developers and providers of such services, requiring them to implement specific disclosures, safety protocols, and data privacy measures. Failure to comply could lead to penalties related to the unauthorized practice of psychotherapy and other violations.

What to do next

  1. Review bill HB26-1139 for applicability to AI systems used in utilization review and mental health services.
  2. Update AI utilization review processes to ensure individual patient circumstances are considered and denials are reviewed by licensed clinicians.
  3. Implement required disclosures, safety protocols, and data privacy measures for mental health companion chatbots if applicable.

Penalties

Provisions related to the unauthorized practice of psychotherapy and other violations.

Source document (simplified)

HB26-1139

Use of Artificial Intelligence in Health Care

| Type | Bill |
| --- | --- |
| Session | 2026 Regular Session |
| Subjects | Business & Economic Development; Health Care & Health Insurance; Professions & Occupations |
Concerning the use of artificial intelligence in health care.

Bill Summary:

Section 2 of the bill requires entities that use an artificial intelligence system or algorithm (AI system) for the purpose of conducting utilization review of health-care services, including health insurance carriers, pharmacy benefit managers, private utilization review organizations, behavioral health administrative services organizations, and managed care entities, to ensure that the AI system complies with certain requirements specified in the bill when determining coverage for services. Specifically, the AI system used must:

  • Not base its determination solely on group data; and
  • Make determinations based on medical or clinical history, the patient's individual clinical circumstances, and other relevant factors specified in the bill, with denial of coverage reviewed by a licensed clinician or physician. The AI system may be used to assist in utilization review, including expedited approvals. A denial or delay of coverage for a service based in whole or in part on medical necessity must be reviewed by a licensed clinician or physician who is competent to evaluate the specific clinical issues.
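As a hypothetical illustration only (this is not text from the bill, and all class and function names are invented for the sketch), the routing constraints above could be expressed as a simple gate: the AI may finalize approvals, but any determination resting solely on group data is invalid, and any denial must be escalated to a licensed clinician.

```python
from dataclasses import dataclass

@dataclass
class Determination:
    decision: str                    # "approve" or "deny"
    used_individual_history: bool    # patient's own medical/clinical history considered
    based_solely_on_group_data: bool

def route_determination(d: Determination) -> str:
    """Route an AI utilization-review determination under the bill's constraints.

    - A determination may not rest solely on group data.
    - The AI may finalize approvals (including expedited approvals).
    - A denial must be escalated to a licensed clinician or physician
      competent to evaluate the specific clinical issues.
    """
    if d.based_solely_on_group_data or not d.used_individual_history:
        return "invalid: individual clinical circumstances required"
    if d.decision == "approve":
        return "auto-approved"
    return "escalate: licensed clinician review required"
```

The key design point the sketch captures is asymmetry: automation is permitted on the approval path, while the denial path always terminates in human clinical review.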

Section 3 defines a "mental health companion chatbot", in part, as an AI system that:

  • Uses generative artificial intelligence to provide adaptive, personalized, and emotionally resonant responses to sustain a one-on-one relationship with a user;
  • Engages in interactive conversations similar to those an individual would have with a licensed mental health professional; and
  • Is represented by the AI systems provider as, or that a reasonable person believes to be, capable of providing mental health therapy or of helping to manage or treat mental health conditions.

Sections 2, 5, 6, and 7:

  • Declare that an AI systems provider engages in the unauthorized practice of psychotherapy if the AI system used:

      • Represents, states, or indicates, explicitly or implicitly, that the AI system is a human mental health provider or is authorized to engage in the practice of psychotherapy;

      • Uses prohibited titles, abbreviations or descriptions of professions, credentials, or services that only a mental health professional authorized to provide psychotherapy in the state (regulated professional) may use;

      • Delivers psychotherapy services that would be considered the practice of psychotherapy without oversight by an individual who is a regulated professional; or

      • Is a mental health companion chatbot and: fails to provide clear and conspicuous notice to the user that the AI system is not a human and is not authorized to provide psychotherapy, therapy, or counseling or to manage or treat mental health conditions; fails to disclose that the AI system is artificial intelligence when asked; fails to implement a protocol to address suicidal ideation or self-harm expressed by users, including referring users to a suicide hotline or crisis text line; or sells, shares, or discloses identifiable mental health data or conditions the use of the mental health companion chatbot on a user agreeing to those practices;

  • Allow for the use of an AI system to provide general information, support, or education, without representing that the AI system is a regulated professional;

  • Exempt from the bill the development, testing, or evaluation of an AI system conducted for the purpose of research by an institutional review board; and

  • Prohibit a regulated professional from billing a public or private payer for psychotherapy services that are provided directly to a client and that are conducted by an AI system or for supervision of candidates or professional consultations that are provided by an AI system without human oversight.

Section 4 requires a regulated professional to disclose to a client the purposes for which the regulated professional uses AI systems or therapeutic or diagnostic devices that include AI systems in their practice and when those AI systems or devices are used, the right of a client to consent to a disclosure of confidential communications, and other disclosures.

Sections 2 and 7 prohibit a health insurance carrier and a payer of services under the "Colorado Medical Assistance Act" and the "Children's Basic Health Plan Act" from paying for psychotherapy services that are provided directly to a client and that are conducted by an AI system.

(Note: This summary applies to this bill as introduced.)
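As a hypothetical sketch (not the bill's text; the function names, keyword list, and message wording are all assumptions made for illustration), the chatbot provisions above map onto a few concrete guardrails: a conspicuous non-human disclosure at session start, truthful self-identification when asked, and a crisis protocol that refers users to a hotline such as the 988 Suicide & Crisis Lifeline before any other response.

```python
# Illustrative keyword trigger list; a real system would need far more
# robust detection than substring matching.
CRISIS_TERMS = {"suicide", "kill myself", "self-harm", "end my life"}

DISCLOSURE = ("Notice: I am an AI chatbot, not a human, and I am not "
              "authorized to provide psychotherapy, therapy, or counseling "
              "or to manage or treat mental health conditions.")

CRISIS_REFERRAL = ("If you are thinking about suicide or self-harm, please "
                   "contact the 988 Suicide & Crisis Lifeline (call or text 988).")

def start_session() -> str:
    # Clear and conspicuous notice before any conversation begins.
    return DISCLOSURE

def respond(user_message: str) -> str:
    msg = user_message.lower()
    # The crisis protocol takes priority over every other response.
    if any(term in msg for term in CRISIS_TERMS):
        return CRISIS_REFERRAL
    # Truthful self-identification when asked.
    if "are you an ai" in msg or "are you human" in msg:
        return "I am artificial intelligence, not a human."
    return "[normal chatbot response]"
```

The sketch deliberately orders the checks so that safety disclosure and crisis referral can never be bypassed by other conversational logic; the bill's data-sharing restrictions (no selling identifiable mental health data, no conditioning use on consent to such sharing) would live in the service's data-handling layer rather than the response loop.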

Prime Sponsors


Representative Junie Joseph
Representative Sheila Lieder

Committees

House

Health & Human Services




Status

Introduced

Under Consideration

Upcoming Schedule

1 meeting

Wed, Mar 4: House Health & Human Services, 1:30 PM, HCR 0112


Related Documents & Information

| Date | Version | Documents |
| --- | --- | --- |
| 02/04/2026 | Introduced | PDF |

| Date | Version | Documents |
| --- | --- | --- |
| 03/02/2026 | FN1 | PDF |

| Activity | Vote | Documents |
| --- | --- | --- |
| Refer House Bill 26-1139, as amended, to the Committee of the Whole. | The motion passed on a vote of 8-5. | Vote summary |

| Date | Amendment Number | Committee/ Floor Hearing | Status | Documents |
| --- | --- | --- | --- | --- |
| 03/04/2026 | L.002 | HOU Health & Human Services | Passed | PDF |
| 03/04/2026 | L.001 | HOU Health & Human Services | Passed | PDF |
* Amendments passed in committee are not incorporated into the measure unless adopted by the full House or Senate.

** The status of Second Reading amendments may be subsequently affected by the adoption of an amendment to the Committee of the Whole Report. Refer to the House or Senate Journal for additional information.

| Date | Location | Action |
| --- | --- | --- |
| 03/04/2026 | House | House Committee on Health & Human Services Refer Amended to House Committee of the Whole |
| 02/04/2026 | House | Introduced In House - Assigned to Health & Human Services |
Prime Sponsor

Rep. J. Joseph | Rep. S. Lieder

Sponsor

(None)

Co-Sponsor

(None)


Classification

Agency
Colorado General Assembly
Published
January 1st, 2026
Instrument
Bill
Legal weight
Not yet binding (proposed legislation)
Stage
Draft
Change scope
Substantive

Who this affects

Applies to
Insurers Healthcare providers
Geographic scope
State (Colorado)

Taxonomy

Primary area
Healthcare
Operational domain
Compliance
Topics
Artificial Intelligence Insurance Mental Health
