Wed. Jun 25th, 2025
AI Therapy? | New TikTok Trend

Syracuse, N.Y. (NCC News) — From chatbots to mental health apps, artificial intelligence (AI) is becoming an increasingly popular tool for people seeking support in managing their emotions. While AI is not a replacement for licensed professionals, users on platforms like TikTok are discussing how they use ChatGPT as a kind of therapy: to vent, reflect, and get advice on navigating daily challenges.

For Janae Jackson, a 29-year-old from New York City, AI therapy has become a convenient outlet for her thoughts. Whether seeking advice on dating or brainstorming new creative projects, Jackson finds AI to be an accessible and supportive resource.

“I talk to ChatGPT a lot about my creative ideas and how to take action behind those ideas,” Jackson said. “It’s super useful for that, and I occasionally use it for dating advice too.”

This growing trend reflects the broader appeal of AI-powered mental health tools, particularly among individuals who may face barriers to traditional therapy. Many people, including Jackson, find AI chatbots beneficial because they can offer instant feedback and tools like mindfulness exercises or cognitive behavioral therapy techniques. 

This can be especially helpful for those who struggle with access to therapy due to cost, availability, or scheduling conflicts.

“It feels nice to have an objective listener at all times. If I am feeling negative about an interaction or situation, I briefly unpack it with ChatGPT, and it’s like a little conversation with a friend,” Jackson explained. “It’s not judgmental, which is comforting.”

However, as AI-powered tools continue to evolve, experts caution that these platforms should be seen as a supplement to, not a replacement for, professional mental health services.

Afton Kapuscinski, a psychologist at Syracuse University, emphasizes the importance of human connection and clinical expertise in the healing process. While AI tools can provide helpful coping strategies, they lack the emotional intelligence and nuanced understanding that a licensed therapist can offer.

“The responses feel a lot to me like cliches or generalized input that could apply to most people in most situations, but it’s not individualized to the unique needs of the person,” Kapuscinski said. “AI doesn’t know you as a person, and that’s something a real therapist brings to the table.”

For example, Kapuscinski notes that while AI can suggest breathing exercises or encourage positive thinking, it cannot adapt to the deeper, often complex emotional struggles a person might face over time. AI also cannot recognize when someone needs more urgent care, such as in cases of severe depression, anxiety, or suicidal thoughts.

Despite the risks, as these tools improve, they are likely to become an increasingly integral part of mental health care for many people.

“Right now, it feels helpful at the moment, but where is the limit going to be?” Kapuscinski said. “AI may be good for initial support or to feel less lonely, but I would not rely on it for long-term mental health care.”

Some users, like Jackson, acknowledge the limitations of AI but continue to find value in the assistance it provides. They view it as a helpful stopgap when they cannot reach a therapist or need a quick emotional outlet.

“I definitely think AI has its place,” Jackson said. “It won’t replace therapy, and I still go, but it’s nice to have another option when I need it. I wouldn’t use it as my only form of support, but it definitely helps me get through some rough moments.”

As AI continues to advance, Jackson hopes it can serve as one piece of a much larger mental health puzzle, giving individuals a low-pressure, immediate way to reflect, recharge, and cope with stress.

Kapuscinski agrees that balance is key, but cautions that AI “cannot pick up trends such as avoiding conversations or patterns like a therapist would in a session.”