
AI and Our Next Conversations in Higher Education
A Q&A with Instructure’s Ryan Lufkin
In recent years, technology industry press coverage has focused largely on the new and remarkable capabilities AI offers. It can seem as if our dream functionalities have already been delivered, with more still to be imagined. And the maneuvering of tech giants on the world stage has been both entertaining and a little unsettling. This may feel like everything you could want in a major technological shift. But is it?
Happily, the education market offers another perspective. We still hear leaders asking us to consider how best to use and adopt the technology, just as they always have with any groundbreaking technology applied in education. One such voice is Ryan Lufkin, vice president of global strategy for Instructure, maker of the market-leading Canvas learning platform. Here, CT asks Lufkin how the focus of AI discussions in education will shift in the coming months, from the latest cool features and functions to a rigorous examination of implementations aimed at supporting the enduring values of our higher education institutions.
Mary Grush: In higher education, how will our discussions of AI change in the coming months?
Ryan Lufkin: In 2026, the AI conversation in education will shift from experimentation to accountability — and that’s a good thing.
Grush: It sounds like a really good thing! What are some areas where that will likely be manifest?
Lufkin: Institutions will need to focus on governance, including transparency, vendor selection and management, ethics, and academic integrity, while also showing what has actually improved.
Grush: That’s such an extensive range of things to consider. Overall, what’s the single most important factor as the AI conversation in education shifts, as you say, from experimentation to accountability?
Lufkin: Without a doubt, it’s the absolute requirement to protect student data privacy when training AI tools.
That is a hard-and-fast rule. And if you aren’t a vendor who’s experienced in the higher education space, you might think that rule is negotiable, and it’s absolutely not. So, at Instructure we spend a lot of time working with our partners and our universities to say, look, as you’re choosing vendors, or as you’re building this AI infrastructure, you need to make data security, data privacy, and data accessibility the non-negotiable requirements for any of those processes.