Robot Therapists? Colorado's New Bill Draws a Hard Line on AI in Mental Health.
Sponsors: Gretchen Rydin, Javier Mabrey, Judy Amabile, Kyle Mullica · Health & Human Services

Illustration: Assembly Required
The Bottom Line
If you're seeing a therapist, this bill ensures a real human is the one actually treating you, not a chatbot. It bans mental health professionals from using artificial intelligence to analyze your emotions or replace their clinical judgment, and it outright bars tech companies from marketing autonomous "AI therapists" to the public in Colorado.
What This Bill Actually Does
House Bill 26-1195 is a preemptive strike against the automation of mental health care. At its core, the bill draws a strict legal boundary around what artificial intelligence can and cannot do in a therapy setting. For any regulated mental health professional in Colorado—psychologists, clinical social workers, marriage and family therapists, and even unlicensed psychotherapists—the bill strictly prohibits using an artificial intelligence system to directly interact with clients in any form of therapeutic communication. It also bans professionals from using AI to detect a client's emotions or mental states, or to generate treatment plans and therapeutic recommendations without explicit human review and approval.
But the legislation isn't entirely anti-technology; it's about staying in your lane. The bill explicitly allows therapists to use AI for administrative support (like managing appointment schedules, processing insurance claims, and drafting basic logistical emails) and supplementary support (like organizing external resources or analyzing anonymized data). One of the most common current uses of AI in healthcare—recording and transcribing therapy notes—is still allowed, but with major new guardrails. If a therapist wants to use an AI note-taker, they must obtain clear, explicit, and revocable written consent from the client. The bill specifically outlaws "sneaky" consent: therapists cannot bury this agreement in a broad Terms of Service document, nor can they claim consent was given just because a user interacted with a digital interface.
Finally, the bill goes after the tech sector by looping in the Colorado Consumer Protection Act. It makes it an unfair trade practice for any person or company to advertise an AI system in a way that implies its outputs are endorsed by a licensed therapist, or to claim the AI itself provides actual psychotherapy services. It also bans these platforms from making false promises about user data confidentiality. The bill does carve out sensible exemptions: university training programs, federally overseen research, religious ministry, peer support, and general wellness apps (like guided meditation or self-help tools) are exempt, provided those apps clearly disclose they are not a substitute for actual clinical care.
What It Means for You
If you are one of the hundreds of thousands of Coloradans who see a therapist, this bill is designed to protect the privacy of your deepest thoughts and the quality of your care. You go to therapy to be heard by a human, not analyzed by an algorithm. Under this bill, your therapist cannot outsource your treatment to a chatbot or use facial-recognition AI to "read" your emotions during a telehealth session. Your care remains entirely in human hands.
If your therapist uses a modern AI transcription tool to write their session notes so they can look at you instead of their laptop, you are going to see a change in the waiting room. Before they can use that technology, they will have to hand you a specific, standalone consent form explaining exactly what AI system is being used and why. Here is the part that really matters: you have the absolute right to say no. The bill explicitly states that a therapist cannot deny you services just because you refuse to let an AI record or transcribe your session. Additionally, if you've been bombarded by targeted ads for "AI therapy chatbots" on social media, you can expect those companies to dramatically change their marketing in Colorado to clearly state they are not real therapy.
Here are a few things you can do right now:
- Check your intake forms: The next time you visit your therapist or log into a telehealth portal, ask if they use AI note-takers and request to see their specific AI consent policy.
- Audit your wellness apps: If you use mental health or coaching apps, check their disclosures. Under this new law, they must clearly state they are not clinical care.
- Make your voice heard: If you have strong feelings about AI in healthcare—whether you want stricter bans or think AI could make therapy more affordable—contact the House Health & Human Services Committee members before the bill goes to a hearing.
What It Means for Your Business
If you own a private practice, run a behavioral health clinic, or operate a health-tech startup in Colorado, HB26-1195 is going to force an immediate audit of your technology stack. The state is making it crystal clear that while you can use AI to run your business, you cannot use AI to be your business. You are in the clear to use AI for administrative support—think automated scheduling reminders, billing software, and generic logistics. You can also use AI for supplementary support, such as drafting therapy notes or identifying community resources, but only if you maintain full clinical responsibility for the outputs.
If you use AI scribes (like Freed, Nabla, or similar transcription tools), your current compliance protocols are likely insufficient. The bill mandates explicit written consent that is separate from your general practice intake forms. You cannot rely on a click-wrap agreement or a broad Terms of Service update. If a client declines the AI note-taker, you must accommodate them and provide services the old-fashioned way. For tech startups and developers, this bill creates massive liability under the Colorado Consumer Protection Act. If your marketing implies your AI chatbot provides "therapy" or is equivalent to a licensed professional, you are committing an unfair trade practice. You must slap clear, unambiguous disclaimers on your products stating they are for self-help and do not treat mental health disorders.
Here is what business owners should do THIS WEEK:
- Draft a standalone AI consent form: Work with your legal counsel to create a specific, plain-English consent form for any AI transcription or recording tools you use in your clinic.
- Review your tech vendors: Ensure any AI tools you use for analyzing patient data are fully anonymized, as required by the supplementary support provisions of the bill.
- Scrub your marketing copy: If you operate a wellness, coaching, or peer-support app, ensure your website, app store description, and user interface clearly state that your service is not a substitute for clinical care.
Follow the Money
Because this bill was just introduced, the nonpartisan Legislative Council Staff has not yet released the official fiscal note detailing the exact dollar impact on the state budget. However, we can read the tea leaves based on how the bill is structured. It doesn't create a new government program or offer any tax incentives, so it won't require massive general fund appropriations. Instead, the financial impact will primarily fall on regulatory enforcement.
The Department of Regulatory Agencies (DORA), which oversees the various boards regulating psychologists, social workers, and counselors, will bear the brunt of investigating complaints against professionals who misuse AI. Furthermore, because the bill ties AI misrepresentation to the Colorado Consumer Protection Act, the Attorney General's office may need to allocate resources to investigate and penalize rogue tech companies advertising AI therapists. For local businesses and clinics, there will be minor administrative costs associated with updating compliance software and legal intake forms to meet the new strict consent requirements.
Where This Bill Stands
House Bill 26-1195 was officially introduced in the House on February 11, 2026, and has been assigned to the House Health & Human Services Committee. It boasts strong cross-chamber sponsorship from Representatives Rydin and Mabrey, alongside Senators Amabile and Mullica. This dual-chamber backing right out of the gate suggests the bill has serious legislative momentum and is a priority for lawmakers focused on healthcare policy.
Crucially, the bill contains a Safety Clause. In Colorado legislative terms, this means the General Assembly considers the issue an urgent matter of public health and safety. If passed and signed by the Governor, the law will take effect immediately, bypassing the usual 90-day waiting period after the legislative session ends. Because national anxiety over artificial intelligence in healthcare is currently at a boiling point, expect this bill to move swiftly through committee hearings. If you want to testify or submit written comments, you should start preparing now before it hits the committee calendar.
The Opportunity Signal
Where this bill creates practical upside for operators: the opening, the key constraints, and the move to make while the window is still favorable.
AI Compliance Solutions for Therapy Practices
Colorado's HB26-1195 mandates rigorous, explicit written consent before mental health professionals can use AI to record or transcribe sessions, and consent buried in general intake forms or Terms of Service no longer counts. This creates urgent demand for private practices and clinics to update their operational protocols to avoid complaints to the Department of Regulatory Agencies (DORA) and to maintain client trust. Businesses can offer consulting services, software tools, or standardized legal templates that bring practices into compliance with these non-negotiable consent requirements, which, because the bill carries a Safety Clause, would take effect immediately upon signing.
- Mandatory standalone, explicit written consent for AI recording/transcription.
- Practitioners cannot deny services if clients refuse AI use.
- Non-compliance risks DORA enforcement and client complaints.
Next move: Develop a compliance toolkit, including a plain-English AI consent form template and an audit checklist, and pitch it directly to the Colorado Psychological Association and the National Association of Social Workers, Colorado Chapter, within 15 days.
Compliant AI Tool Development for Therapy Support
While direct AI therapy is banned, HB26-1195 explicitly greenlights AI for specific administrative tasks like scheduling, billing, and email management, and for supplementary support such as organizing external resources or analyzing anonymized data, provided human review is maintained. This distinction opens a niche for developers to create or adapt AI tools that enhance therapist efficiency without crossing ethical or legal boundaries. The market favors solutions that integrate seamlessly into existing practice workflows, reducing administrative burdens and allowing therapists to focus on direct client care, all while ensuring the human clinician retains full clinical responsibility for any AI-generated outputs.
- AI is permitted for administrative and supplementary, non-therapeutic tasks.
- Human oversight and full clinical responsibility for AI outputs are mandatory.
- Solutions must strictly avoid direct therapeutic communication or emotion detection.
Next move: Conduct market research on Colorado mental health practices to identify administrative pain points (e.g., scheduling, resource compilation). Then build a prototype AI-powered administrative assistant that stays within HB26-1195's compliant parameters, and target a demo presentation to a large Colorado behavioral health clinic within 30 days.
Health-Tech Regulatory Marketing & Legal Consulting
Colorado's new bill brings the advertising of AI mental health tools under the stringent Colorado Consumer Protection Act, making it an unfair trade practice to claim that an AI provides actual psychotherapy or that licensed therapists endorse its outputs. This creates immediate and significant legal and marketing compliance challenges for wellness apps, coaching platforms, and AI startups operating in or targeting Colorado. Specialized legal and marketing consultants can audit existing marketing copy, review product disclaimers, and ensure all public-facing communications clearly distinguish the product from clinical care, mitigating exposure to state Attorney General enforcement and protecting brand reputation.
- Marketing AI as "therapy" or endorsed by professionals is an unfair trade practice.
- Wellness/coaching apps must prominently disclaim they are not clinical care.
- False claims about data confidentiality are strictly prohibited.
Next move: Prepare a targeted service offering for health-tech startups, focusing on Colorado Consumer Protection Act compliance for marketing and disclaimers, and present it to local startup accelerators (e.g., Techstars Boulder, Boomtown Accelerators) or venture capital firms investing in health tech within Colorado in the next 30 days.
Related Bills
- Local Governments Could Soon Rewrite the Rules for Colorado Massage Businesses (In Committee)
- HB26-1228: Need a Therapist? Colorado is Changing the Rules to Fix the Shortage. (Passed House)
- HB26-1215: The Two-Word Typo Fixing Colorado's Crackdown on Stolen Goods Online (In Committee)
- HB26-1216: Fixing the Fine Print: The Bill Cleaning Up Colorado's Tax and Transit Laws (In Committee)