
Without AI “Quiet Cars,” Learning Is At Risk
In the late 1990s, a group of commuters would board the early-morning Amtrak train from Philadelphia to Washington, D.C. They’d sit in the first car behind the locomotive, enjoying communal, consensual silence. Eventually, with the conductor’s help, their car was officially designated a noise-free zone. Soon after, Denise LaBencki-Fullmer, an Amtrak manager, recognized the value of a peaceful ride and institutionalized the program as the quiet car. At the request of passengers, it spread to a number of other commuter services.
The educational technology sector has something to learn from the Amtrak commuters’ deliberate design of their environment. Learning requires the ability to concentrate. You need a space where you can process information, recall facts, analyze complex questions and think creatively about ideas, problems and solutions. Learning is not a smooth and easy process—in fact, it is desirable that it be a bit difficult, because that is how we actually learn. Getting someone else to do your learning tasks for you, as tempting or comfortable as that might be, won’t work.
A great deal of learning still happens online, even at colleges that value in-person teaching as much as Princeton University does. The learning management system is where our students find readings, review lecture slides and practice their skills and comprehension on homework assignments. It is also where many instructors administer assessments, both low-stakes quizzes and high-stakes exams.
Last month, Google launched a feature called “Homework help” in Chrome—a shiny blue button right in the address bar. By clicking it, a student could prompt Google Gemini to summarize a reading or solve a quiz question in a matter of seconds, robbing the student of the very learning activity they were there to do. A few weeks later, Google repositioned the feature so it is a bit less obvious (at least for now), but the question remains: What kind of AI tools should we make available to our students in learning management systems and assessment platforms?
You might be thinking that this is a pointless question: AI is going to be everywhere—it already is. And sure, that is true. Also, if a student wants to use AI, it is easy enough to open another browser tab and ask an LLM for help. But installing the AI right in the environment in which the student is trying to learn is like sitting next to the most obnoxious cellphone yeller on your train ride: You can’t think your own thoughts, because the distraction is too great.
Just as there are quiet cars on trains, there can be quiet areas of the internet. Learning management systems and assessment platforms should be among them. That doesn’t mean there can’t be good uses of AI in learning. Our students should know how to use AI responsibly, thoughtfully and critically, as should the faculty who teach them (I sometimes use AI in my own teaching, for instance). But we should also ask that the companies that provide us with learning technologies think critically and carefully about whether AI aids the difficult, effortful work that learning requires or, in fact, removes the opportunity for it. AI is inevitable, but that doesn’t mean we can’t be intentional about how, why and where we implement it.
I have spent the last few weeks talking with colleagues at other colleges and universities and with the partners that provide our educational technology. Everyone I have spoken with cares about education, and none of them think it’s a good idea to implement AI in a way that so clearly pulls students out of the learning process. It is not unrealistic to hope that people in the tech industry and the education sector could come together, make the same kind of pact the train commuters made some 25 years ago and declare our online learning systems an AI quiet zone. We would be doing right by our students if we did.