
Helping College Students Emotionally Before They Turn to AI
Some ChatGPT users ask the chat bot for emotional support.
As more students engage with generative artificial intelligence and chat bots, the ways they use AI are changing. A 2025 report published by the Harvard Business Review, drawing on an analysis of social media discussions, found that “therapy/companionship” is the No. 1 use case for generative AI chat bots.
For college counseling centers, this change reflects students’ desire for immediate support. “This is not a generation that would call a counseling center and get an appointment two weeks, four weeks later,” said Joy Himmell, director of counseling services for Old Dominion University. “They want help when they want it.”
But it’s important for counseling centers to educate students on the risks of using generative AI tools for well-being support, Himmell said.
The research: While ChatGPT and similar text-generating chat bots are touted as productivity tools that can expedite learning and workflow, some people turn to them for personal and emotional support.
In a 2024 safety report, OpenAI found that some users experience anthropomorphization—attributing humanlike behaviors and characteristics to nonhuman entities—and form social relationships with the AI. Researchers hypothesized that humanlike socialization with an AI model could affect how individuals interact with other people and hamper the development of healthy relationship skills.
A 2025 study from the MIT Media Lab and OpenAI found that high usage of ChatGPT correlates with increased dependency on the tool, with heavy users more likely to consider ChatGPT a “friend” and to find messaging with ChatGPT more comfortable than face-to-face interaction. However, researchers noted that only a small share of ChatGPT users are affected to that extent or report emotional distress from excessive use.
Another study from the same groups found that higher daily usage of ChatGPT correlated with increased loneliness, dependence and problematic use of the tool, as well as lower socialization with other humans.
In extreme cases, individuals have built entirely fabricated lives and romantic relationships around AI, attachments that can result in deep feelings and real hurt when the technology is updated.
This research shows that most people, even heavy users of ChatGPT, are not seeking emotional support from the chat bot and do not become dependent on it. Among college students, only a minority want AI to provide well-being support: a survey from WGU Labs found that 41 percent of online learners would be comfortable with AI suggesting mental health strategies based on a student’s data, while 38 percent said they would be somewhat or very uncomfortable with such use.
In higher education: On campus, Himmell has seen a growing number of students start counseling for anxiety disorders, depression and a history of trauma. Students are also notably lonelier, she said, and less likely to engage with peers on campus or attend events.
Student mental health is a top retention concern, but few counseling centers have the capacity to provide one-on-one support to everyone who needs it. At her center, more students prefer in-person counseling sessions, which Himmell attributes to a desire to feel more grounded and connected. But many still engage with online or digital interventions as well.
A significant number of colleges have established partnerships with digital mental health service providers to complement in-person services, particularly since the COVID-19 pandemic necessitated remote instruction. Such services could include counseling support or skill-building education to reduce the need for intensive in-person counseling.
Digital mental health resources cannot replace some forms of therapy or risk assessment, Himmell said, but they can augment counseling sessions. “Having automated AI systems with emotional intelligence to be able to convey some of those concepts and work with students, in some ways, it actually frees the counselor in terms of doing that kind of [skill building], so that we can get more into the nitty-gritty of what we need to talk about,” she explained.
AI counseling or online engagement with ChatGPT is not a solution to all problems, Himmell said. For those who use chat bots as companions, “it sets up a system that is not based in reality; it’s a facade,” Himmell said. “Even though that can serve a purpose, in the long run, it really doesn’t bode well for emotional or social skill development.”
Faculty and staff need to learn how to identify students at risk of developing AI dependency. Compared to anxiety or depression, which have more visible cues in the classroom, “the symptomology related to that inner world of AI and not engaging with others in ways that are helpful is much more benign,” Himmell said. Campus stakeholders can watch for students who are socially withdrawn or reluctant to participate in group work, signs that may point to social isolation and possible digital dependency.
AI in the counseling center: Part of addressing student AI dependency is becoming familiar with the tools and helping students learn to use them appropriately, Himmell said. “We need to be able to harness it and use it, not be afraid of it, and embrace it,” she said. She also sees a role for counseling centers and others in higher education to provide additional education on AI in different formats and venues.
Old Dominion partners with TalkCampus, which offers 24-7 peer-based support. The counseling service is not automated, but the platform uses AI to analyze conversation data, identify risk factors that may come up and surface support when needed.