
Dartmouth Builds Its Own AI Chatbot for Student Well-Being
Dartmouth College is developing a new student-facing AI-powered chatbot to improve mental health and thriving on campus. A team of 130 undergraduate researchers is helping faculty develop the app, Evergreen, by examining relevant research and making the bot more conversational for student users.
The goal is to leverage artificial intelligence to provide personalized interventions for students, considering their needs, habits and overall health goals, said Nicholas Jacobson, associate professor of biomedical data science and psychiatry and one of the project leaders.
State of play: Mental health issues are one of the top reported barriers to student retention in higher education. A large share of college students report feeling depressed or anxious, according to national studies of student mental health.
At the same time, many college campuses face a shortage of available counselors, and in any case, a significant number of students are unaware of or hesitant to use campus resources for mental health support.
Some colleges are using technology to put support resources in students’ hands when they need them, including through tele–mental health counseling or digital mental health apps. Often, these apps are developed by a third party and sold to the college or university as an additional resource to support on-campus work.
What makes Evergreen different from other AI-powered tools on the market is the hands-on approach of employing student developers, the amount of data students can provide to the chatbot and the Dartmouth-specific interactions students may have.
How it works: Evergreen is a mobile app students can choose to download. Unlike general-purpose AI chatbots such as ChatGPT or Gemini, Evergreen focuses only on health topics, including exercise, diet, time management and sleep.
The chatbot offers targeted messages based on the student’s stated health goals and other linked data—including sleep hours, step counts, geolocation and learning management system information—to provide insights and create health plans. “We’re trying to gather a lot of contextual information and then use that information [with] AI to really power a lot of components,” Jacobson said.
Like a smartwatch reminding you to stand up or drink water, Evergreen will notice if a student who normally visits the fitness center hasn’t done so or encourage a student to leave the library after studying for six hours, Jacobson said.
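A nudge like the one Jacobson describes could, in principle, be expressed as a simple rule over recent activity data: establish that the student has a habit, then notice when they lapse from it. The sketch below is purely illustrative; the function name, thresholds and data shape are hypothetical, not Evergreen's actual logic.

```python
from datetime import date

def gym_nudge(visit_dates, today, habit_window=14, habit_min=6, lapse_days=4):
    """Suggest a nudge if a student who usually visits the gym has lapsed.

    visit_dates: dates the student checked into the fitness center.
    A student counts as a regular if they logged at least `habit_min`
    visits in the past `habit_window` days; a nudge fires only if their
    most recent visit was more than `lapse_days` ago.
    """
    recent = [d for d in visit_dates if (today - d).days <= habit_window]
    if len(recent) < habit_min:
        return None  # not a habitual gym-goer; no baseline to compare against
    days_since_last = (today - max(recent)).days
    if days_since_last > lapse_days:
        return (f"You usually hit the gym, but it's been "
                f"{days_since_last} days. Quick workout today?")
    return None
```

The point of the baseline check is that the nudge only triggers for a deviation from the student's own routine, rather than pushing a one-size-fits-all reminder at everyone.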

Students using Evergreen’s chatbot, Evie, can select which data permissions the app has access to. These permissions can be toggled on and off whenever the student decides.
All data is encrypted and not viewable by Dartmouth’s administrators, Jacobson said, and students opt in to every piece of data they share with Evergreen. “They’re in complete control of what they want to do and when; they could say yes at the beginning and then say, ‘I don’t know about this anymore’ later.”
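The opt-in model described here amounts to per-category permissions that default to off and can be revoked at any time. A minimal sketch of that idea, with invented category names and no claim to match Evergreen's implementation:

```python
class DataPermissions:
    """Per-category opt-in data sharing: everything starts OFF."""

    CATEGORIES = {"sleep", "steps", "geolocation", "lms"}  # hypothetical names

    def __init__(self):
        self._granted = set()  # nothing is shared until the student opts in

    def grant(self, category):
        if category not in self.CATEGORIES:
            raise ValueError(f"unknown category: {category}")
        self._granted.add(category)

    def revoke(self, category):
        # Toggling a permission off later is always allowed.
        self._granted.discard(category)

    def allowed(self, category):
        return category in self._granted
```

The design choice worth noting is the empty default: the app never reads a data stream the student has not explicitly switched on, and revoking takes effect immediately.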
Looking ahead: The chatbot will roll out for its first phase of testing in a randomized controlled trial in fall 2026.
The first priority is to make sure Evergreen is effective and safe for use, Jacobson said. But there are opportunities to sell the tool to other colleges and universities.
“Other institutions are interested in this area, but academics are incredibly convinced by data,” Jacobson said. “We want to do it first: nail it before we scale it.”
One of the primary concerns about AI-powered chatbots for mental health support is that they can be unhelpful or can even encourage unhealthy behaviors among young people at risk of harming themselves. Evergreen will be equipped with a feature designed to recognize when a student is in crisis and notify their self-identified support team (which could include a parent, a friend or a university faculty or staff member). Jacobson compared the feature to crash-detection apps that automatically notify an individual’s emergency contacts.
Built by and for students: Since the start of Evergreen’s development last summer, Dartmouth has brought on undergraduate student researchers from various backgrounds to assist with the project. Each of the researchers is paid and commits to participating in weekly meetings as well as a certain number of work hours each week.
“We plan to continue to have undergraduates play a major part of leading this work, having a little over 130 undergraduates at any given time working on this,” Jacobson said, which is about 3 percent of the total campus population.
Ayush Saran, a Dartmouth junior studying economics, chose to join the project because of his experiences juggling responsibilities as a student athlete.
“I thought that Evergreen is definitely a good initiative to help students who are dealing with things like that: find their way around campus and balance time commitments,” Saran said.
One of the first projects student researchers were assigned was content creation, which involved reading clinical studies on a specific topic area and drafting information to feed to Evergreen. Saran and his peers are working on content about food, whereas Teddy Roberts, a senior majoring in modified government, creates dialogues related to exercise.
“They’re being really careful that everything the chatbot is going to say is something they can say is evidence-backed, and we’re specifically cutting things if we can’t 100 percent say that the evidence supports this claim,” Roberts said.
Jacobson noted that AI hallucinations stem from a lack of data or gaps in an LLM’s knowledge, so Dartmouth is trying to get ahead of the problem by creating volumes of content covering all kinds of interactions a student might have with Evergreen.
The first version of Evergreen will be structured dialogues—that is, students can follow a “choose-your-own-adventure” model of chatbot interaction. With the second iteration, students will be able to contribute free-written responses.
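A "choose-your-own-adventure" dialogue is essentially a tree: each node has a scripted prompt and a fixed menu of replies, each pointing at the next node. The sketch below shows that structure with invented topics and wording, not Evergreen's actual content.

```python
# Hypothetical structured-dialogue tree: node id -> prompt plus choice menu.
DIALOGUE = {
    "start": {
        "prompt": "What would you like to work on today?",
        "choices": {"Sleep": "sleep", "Exercise": "exercise"},
    },
    "sleep": {
        "prompt": "Is it harder to fall asleep or to stay asleep?",
        "choices": {"Falling asleep": "wind_down", "Staying asleep": "environment"},
    },
    "wind_down": {"prompt": "A consistent wind-down routine can help.", "choices": {}},
    "environment": {"prompt": "Check the light and noise in your room.", "choices": {}},
    "exercise": {"prompt": "Let's set a weekly movement goal.", "choices": {}},
}

def step(node_id, choice=None):
    """Advance the dialogue: follow the picked choice, else stay at this node."""
    node = DIALOGUE[node_id]
    if choice is not None and choice in node["choices"]:
        return node["choices"][choice]
    return node_id  # leaf node or unrecognized choice: nothing to follow
```

Because every prompt is pre-written, each response can be vetted against the evidence before it ships; the free-text second iteration trades that guarantee for flexibility.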
In addition to delivering reliable health information in a clear and relevant manner to Dartmouth students, researchers want the tool to understand the lingo and jargon of campus.
“We don’t want it to feel like you’re talking to a chatbot,” Roberts said. “A lot of it comes down to using the terminology that we all share as a common experience. We’ve been really trying to make it specific to our campus.”