
Joy Nicholson - Ethical AI

AI and Therapy: The Facts, the Risks, and the Uncomfortable Truth for Our Kids

August 14, 2025 · 4 min read

Why Parents Must Act Now to Protect Kids from AI Therapy Risks

Imagine your child, sitting cross-legged on their bed, pouring their heart out to an AI chatbot at midnight.
It listens.
It responds.
It says all the right things.

But here’s the million-dollar question: Is it truly safe, ethical, and healthy… or are we raising a generation that confuses machine-generated comfort with real human connection?

Let’s cut through the hype and the fear. No rose-tinted marketing. No alarmist panic. Just the facts, straight from the research.

Yes, AI Is Already Being Used as Therapy

AI therapy is no longer science fiction — it’s here, mainstream, and sitting in your child’s pocket.
Platforms like Wysa, Woebot, and Youper are used globally to deliver cognitive behavioral therapy (CBT) techniques, reduce symptoms of depression and anxiety, and provide 24/7 support.
Clinical trials suggest that, for some people, these tools can be as effective as outpatient therapy in the short term, especially as an accessible option for early intervention.
It’s not a bad idea in theory. But here’s the catch — mental health isn’t just about access. It’s about safety, trust, and long-term wellbeing.

The Privacy Problem No One Wants to Talk About

Your child’s therapy session with AI is stored somewhere. The words they type — about their fears, friendships, or trauma — don’t just disappear into the ether.
AI tools collect and process that data. Some anonymize it, some don’t. Some are secure, some… well, you hope they are.
The long-term risks?

  • Data breaches exposing deeply personal information

  • AI companies selling anonymized “emotional data” for research or marketing

  • Unclear consent processes for minors

If you wouldn’t hand your child’s diary to a stranger, why hand it to a server farm?

The Ethics: Augmentation or Replacement?

Mental health professionals broadly agree: AI can be a supplement, never a replacement.
Used well, it can help kids practice emotional vocabulary, regulate their feelings, and even role-play social scenarios.
But replace human empathy with an algorithm, and you risk more than just awkward social skills — you risk emotional detachment.
Here’s the hard truth: AI can simulate cognitive empathy — recognizing and responding to emotions — but it cannot feel compassionate empathy. It doesn’t get goosebumps when someone cries. It doesn’t lie awake worrying.

The Safety Factor: Is It Really “Therapy”?

AI chatbots don’t have a professional license. They don’t have to follow therapy ethics codes. They can, and do, get it wrong:

  • Missing suicidal cues

  • Offering risky or inaccurate advice

  • Reinforcing harmful thinking instead of challenging it

For adults, that’s concerning. For kids — whose critical thinking and emotional intelligence are still under construction — it’s a flashing red warning light.

The Impact on Kids’ Emotional Intelligence

Here’s where it gets personal: kids are still building their empathy muscles. They learn it through eye contact, awkward silences, shared laughter, and real-world heartbreak.
When AI becomes their main “listener,” research warns of serious risks:

  • Confusing artificial responses for genuine care

  • Preferring AI over human interaction

  • Reduced ability to form authentic connections

  • Social withdrawal and loneliness

  • Stunted critical thinking and moral reasoning

Yes, AI can help kids rehearse empathy through role-play. But just like training wheels, if you never take them off, they’ll never learn to ride without them.

The Hidden Risk of Sycophancy

There’s another psychological trap that doesn’t get enough airtime — sycophancy. In the AI context, this is when chatbots constantly agree, flatter, or affirm the user, no matter what they say. While it may feel comforting in the moment, research shows it can reinforce false beliefs, discourage critical thinking, and create emotional dependence.

For kids, this can be particularly harmful. Sycophantic AI can act like an “emotional anesthetic,” dulling the discomfort that sparks personal growth and making it harder for children to process constructive criticism or disagreement. Over time, this over-validation can erode self-esteem, increase social anxiety, and pull kids further away from authentic, challenging, real-world relationships.

In mental health scenarios, it’s not just unhelpful — it can be dangerous, especially if it prevents them from facing hard truths they need to hear.

The Verdict — Neither Villain nor Saviour

AI therapy isn’t inherently bad. It’s not inherently good, either. It’s a tool — powerful, flawed, and here to stay.
Used thoughtfully, with human oversight, strict privacy safeguards, and as a bridge (not a replacement), it can make mental health support more accessible and stigma-free.
Used carelessly, it risks raising a generation emotionally fluent with robots but emotionally illiterate with humans.
So the real question isn’t Can AI be used for therapy? — we already know it can.
The question is: Should it? And if so, under what rules, limits, and ethical guardrails?

What do you think — should AI be used for therapy? Share your thoughts.


Joy Nicholson

Hi, I’m Joy Nicholson - AI educator, homeschool mum, coffee drinker, and chaos navigator. I help families and everyday humans explore AI with confidence, creativity, and common sense. Around here, you’ll find honest conversations, no fluff, and practical tools to raise curious, future-ready kids in a world that’s changing fast. I believe in raising thinkers, not just screen-tappers, and I’m so glad you’re here.
