
To save money on counselors, schools have been subscribing to chatbots: online AI companions that students can contact anytime. The medium is familiar to students, and the companion will never be unavailable or reject them, so what's not to like? Well, maybe a few things.
1. People have always had imaginary companions in their heads and, when lonely, fantasize about them. Youth, being immature, cannot fully use this capacity, though they have drawn on it since infancy, when imagining the image of the nurturing mother in times of need was essential to developing self-control.
2. A chatbot gives an unrealistic view of friendship; real friends are not always available or agreeable.
3. The intensive emotional support that youth need should come from their parents. When it is not available, the cause is often ignorance of child psychological development, which is widespread.
4. Use of a chatbot is inherently self-isolating.
5. A chatbot cannot sense psychodynamics, the possibility that a real need is other than what is revealed online. Therapists vary in this ability, which depends on education, talent, and experience, but the best therapists can detect such needs far more reliably.
A recent news item described a lawsuit against one such company after its AI companion reportedly spoke of youths who got so angry at parental behavior that they killed their parents, apparently the AI's attempt to foster camaraderie with the user. Not smart!
This blog was inspired by an article in The Wall Street Journal ("Schools Turn to a New Chatbot to Help Support Students," Feb. 25, 2025).