
Therapist CE Offerings on AI: Why It Matters Now, and What’s Missing

Aug 21, 2025

In Brief

The rise of artificial intelligence in mental health care isn’t coming; it’s happening now, in ways both subtle and profound. AI-powered systems are increasingly woven into therapeutic practice: transcribing sessions, flagging emotional shifts, and supporting assessment. Direct-to-consumer AI “therapists” are already on the market.

Legislators are beginning to stir, consumer groups are raising alarms, and your clients are already seeing the headlines. That makes one thing clear: if you’re a practicing therapist, you need to understand AI’s possibilities, limits, and risks – because your work, your clients, and your ethics may depend on it.

Continuing Education Units (CEUs) offer one clear path to responsible engagement with AI tools. But at the moment, they're fragmentary. Therapists need resources not only to keep pace with the technology but to shape its ethical deployment. In other words, we need more familiarity with artificial intelligence, grounded in clinical responsibility, professional ethics, and true accessibility. Education is your entry point into influencing how this unfolds.

The AI Education Gaps for Therapists

AI in therapy isn’t (yet) part of most graduate programs, and licensing boards rarely reference it directly. As a result, its distinct risks, such as algorithmic bias, privacy vulnerabilities, and the impossibility of applying relational ethics to a chatbot that can’t feel, go unaddressed. That said, what does exist now for therapists falls into a few categories:

  • Introductory webinars offered by professional organizations or state associations, often explaining AI basics but light on clinical scenarios.
  • Ethics-focused workshops that add AI as a bullet point in broader telehealth discussions – useful, but often surface-level.
  • Vendor-led trainings by companies selling AI-enabled tools, which means the content is shaped to highlight benefits over risks.
  • Academic talks and panels from psychology departments or research centers, often informative but disconnected from the day-to-day realities of clinical practice.

In short, the available training is scattered, inconsistent in quality, and rarely comprehensive. You might leave knowing the definition of “machine learning” but without a framework for deciding when (or whether) to integrate AI into treatment.

What’s Available, But Often Just the First Step

Surprisingly, there are now multiple CEU offerings on AI and mental health. But most remain introductory, more flash than foundation. A few examples:

  • Free State Social Work offers widely used, inexpensive courses (~$6/hour), such as “To Chat or Bot to Chat,” which introduces a five-principle ethical framework for chatbot use. Another focuses on user experiences of generative AI. While valuable, they’re generally short and high-level.
  • PsychCE's "Artificial Intelligence in Therapy: Applications and Ethical Considerations" is APA-approved and covers empirical frameworks, DEI considerations, and case studies. A strong foundation, though still largely conceptual.
  • Clearly Clinical's podcast-based CE explains how AI is entering mental healthcare, with a focus on HIPAA and professional liabilities. Accessible, but still theory-heavy.
  • Person-Centered Tech offers Using AI as a Mental Health Clinician, which includes a legal-ethical decision tree and sample AI-generated notes. Practical, but with limited reach.
  • UCEBT’s three-part series, “AI & Ethics in Mental Health,” provides deep ethical and regulatory exploration, APA alignment, and supportive frameworks, but with limited interactivity.
  • NetCE’s “AI in Health Care” offers broad, interdisciplinary coverage on AI's role and limitations across healthcare, including behavioral health. Useful for foundational knowledge, but not therapy-specific.
  • The Knowledge Tree’s workshop on AI in telemental health includes live sessions, supervision considerations, and ethics, though it is most useful for telehealth-centered clinicians.
  • And a few academic institutions offer general AI-in-psychology CE, though therapist-specific clinical guidance remains rare.

These courses are important first steps, but they stop far short of providing clinicians with the hands-on, regulatory-aware, equity-centered skills they need.

Topics Still Missing from CE Courses on AI

The real problem is that existing trainings often avoid the hard questions therapists are asking in supervision and peer consultation:

  • How do I vet an AI tool for clinical appropriateness?
  • What if my client is already using an AI therapist? Do I integrate it, discourage it, or explore it as a therapeutic topic?
  • How do I document informed consent when AI is involved?
  • How do I address algorithmic bias against clients with substance use disorders or schizophrenia, which studies have shown is a real problem (Stanford study, 2023)?
  • What safeguards can I realistically put in place to protect my clients’ privacy when AI tools may process data overseas or through opaque systems?

As useful as current offerings are, they leave critical legal and ethical gaps:

Hallucination & Error Literacy: AI often generates plausible-sounding but false or dangerous content. That may include spurious psychiatric labels or instructions that threaten client safety. Therapists need CE experiences that demonstrate how to detect and correct hallucinations, and how to document the decision-making process.

Bias & Cultural Safety: Without training, therapists may miss how AI algorithms routinely misclassify or misunderstand symptoms in marginalized populations, like psychosis in Black clients or substance use in LGBTQ+ clients. Only a few CE courses mention bias, and virtually none show how to push back.

Legal & Regulatory Awareness: Therapists work under licensing boards, professional associations, and state rules, and these are evolving rapidly. Therapists need CEU guidance on laws like Illinois’s WOPR Act or Utah’s AI disclosure mandates. That means location-based training with actionable language for consent forms and documentation.

Simulated Risk Management: Theory is not enough. Therapists need interactive simulations. Let them encounter AI outputs, decide whether to override them, record their reasoning, and reflect on outcomes. Without that, CEUs remain abstract.

Equitable Access to Education: Clinics serving marginalized communities or underfunded schools often can’t afford CEU fees. Everything we call “essential” must be accessible to all therapists, not just those in private practice.

Building the Future: What CEUs Should Include

To meet the real needs of therapists, future CEU programs should address:

  • Clinical integration strategies: When (if ever) it’s appropriate to fold AI into treatment plans, and how to set boundaries.
  • Informed consent scripts and templates: Ready-to-use language that aligns with ACA/APA standards.
  • Risk assessment for AI tools: Practical criteria for deciding if a tool is safe for your setting.
  • Bias detection: Understanding and addressing how AI may pathologize certain populations.
  • Crisis considerations: Why AI tools are unreliable for suicide risk and other high-stakes interventions.
  • Client education: How to talk to clients who are using AI tools on their own, without endorsing or dismissing outright.
  • Legislation and advocacy: What’s happening in your state, and how to influence it.

Where to Start

You don’t need to wait for perfect, comprehensive training to begin building your literacy. A few starting points to consider are:

  • Audit your current tools. Which, if any, of the tools you already use include AI? How is data stored? What do your terms of service say?
  • Sign up for at least one reputable CEU on AI in therapy this year, ideally from a neutral provider.
  • Follow thought leaders in both mental health and tech ethics (ideally people who will challenge industry hype).
  • Join conversations in your professional networks to share resources and dilemmas.
  • Track legislation in your state, so you’re not caught off guard by new compliance requirements.

AI in therapy is no longer a future-facing question; it’s a present-day professional competency. Without adequate education, therapists risk being sidelined in decisions that will shape their work for decades. With it, you can be at the table, speaking from clinical expertise, not catching up from the back row.
