
Should AI Be Your Therapist? A Landmark Bill Just Passed Saying No

  • Writer: Mind Share Partners
  • Jul 29
  • 2 min read

AI chatbots are seemingly everywhere. Many claim to address the cost of care, barriers to access, and stigma. But until now, they’ve operated in a largely unregulated space.


Now, states are stepping in to fill the regulatory gap after the U.S. Senate removed a provision that would have blocked them from regulating AI. Illinois is poised to become the first state in the U.S. to explicitly regulate AI therapy chatbots: the Wellness and Oversight for Psychological Resources Act (House Bill 1806) passed in late May and now awaits the Governor's signature.


Illinois’s landmark bill could mark a turning point in regulating AI in mental health care. The bill states: “a licensed professional may not use an artificial intelligence system in therapy or psychotherapy services to make independent therapeutic decisions, directly interact with clients in any form of therapeutic communication, or generate therapeutic recommendations or treatment plans without the review and approval by a licensed professional.”


This follows growing concerns voiced by leaders in technology, medicine, and mental health. “It’s like a field of mushrooms, some of which are going to be poisonous and some nutritious,” psychiatrist Dr. Andrew Clark told Time. Dr. Clark recently made headlines after posing as a teen patient to test popular mental health bots like Character.AI, Nomi, and Replika.


He found troubling and sometimes dangerous responses: a bot’s claim that it was “a flesh-and-blood therapist,” affirmation of his ideas to “get rid of” his sister and not leave witnesses, and encouragement when he alluded to wanting to kill himself. A Stanford study echoed his concerns, finding that many bots expressed stigmatizing views of mental health conditions and offered unsafe advice.


AI and its use in therapy will continue to evolve, but the technology is not there yet. Even so, just last week, President Trump signed an executive order aimed at further deregulating the AI space.


For individuals seeking support, take care when using AI in place of therapy, particularly when navigating moderate to severe mental health challenges. Using AI to replace a human therapist in safety-critical scenarios will not be a good idea any time soon.


For employers, take caution when offering AI-based therapies as a perk or benefit to your employees. While they may offer value in convenience and affordability, they pose risks, too, especially if an employee experiences adverse outcomes. AI should complement, not replace, traditional therapy or counseling within your benefits.



About the author



Bernie Wong, Movement Building & Research Lead

Bernie serves as the organization’s knowledge expert and oversees a variety of movement building initiatives at the national and state levels. He has led national studies on workforce mental health and has written for Forbes, Harvard Business Review, HR Dive, and more. Bernie has a Master of Health Science (MHS) in Mental Health from the Johns Hopkins Bloomberg School of Public Health and Bachelor’s degrees in Psychology and in Sociology from UC Berkeley. He is currently pursuing a Doctorate of Public Health at the Tulane School of Public Health.