College of Education


How to Effectively Use AI in the Classroom, While Maintaining Trust and Inclusiveness

by Tom Hanlon / Dec 3, 2024

Bill Cope and Mary Kalantzis

Artificial intelligence is here to stay in education—and it will become even more pervasive as we move forward. So how can educators use AI to their own, and their students', advantage in the classroom, in ethical and productive ways? A new book explores issues surrounding this question.

Quick Take

  • Many educators fear that AI will one day replace them in the classroom. That won’t happen. Instead, teachers have the opportunity to use AI to make teaching and learning more effective.
  • As the use of AI in education advances, the role of educators and the education system will need to change to best prepare the next generation of teachers.
  • Instead of the traditional 1-to-30 relationship in a classroom, AI allows for a series of 1-to-1 relationships, giving personalized feedback to each student.

Fears About AI

Generative AI—artificial intelligence that creates new content by learning patterns from existing data—has been used in education since the mid-2010s.

But its use skyrocketed around the time of the pandemic—and just as COVID-19 initially produced a lot of fear and mistrust, so have generative AI models such as GPT, a popular text generator, and DALL-E, an image generator.

“There has been a lot of misunderstanding around what this new technology means and the impact it will have on education,” says Mary Kalantzis, a professor in Education Policy, Organization & Leadership at the College of Education at the University of Illinois Urbana-Champaign. “People were scared. They were saying teachers will be put out of work.”

“It’s not about replacing the teacher,” emphasizes Bill Cope, fellow EPOL professor. “It’s about making the teacher so much more effective. Replacing the teacher with AI would be a disaster in ten different ways. Our argument is how do we fold in these important human values in AI as well.”

Cyber-Social Learning Lab

Cope and Kalantzis have been tireless in investigating how to leverage AI to the advantage of both teachers and students, and in spreading the word about how AI can be used effectively in the classroom. They have published countless articles, books, and book chapters on AI in education, and their website offers a plethora of tools, resources, and learning environments, including a Cyber-Social Learning Lab.

“We have people all over the world who have joined us through the lab,” Kalantzis notes. “The lab is a community of scholars, students and practitioners, from very different backgrounds, including our doctoral students. We have a responsibility to ensure that our doctoral students get to publish. That’s part of preparing them to be in the real and very competitive world.”

That collaborative environment, she adds, is needed to move knowledge forward.

“It’s not like it used to be, the single author with the single idea,” she says. “These issues are so complicated that we need to have the young minds who are now engaging by necessity with these issues and the older ones who are battling with the traditional way of doing things and trying to harness the new ideas. The book itself represents that approach.”

Book Recently Released

The book Kalantzis refers to is her and Cope’s latest work in the arena of AI in education. They are co-editors of Trust and Inclusion in AI-Mediated Education: Where Human Learning Meets Learning Machines, published by Springer and released in September.

“The book came out of our Cyber-Social Learning Lab,” Kalantzis says. “We’re not just doing the experimental work and the research work but also populating these ideas and disseminating them in a broader field.”

Two former doctoral students of theirs, Dora Kourkoulou (Ph.D. ‘23 EPOL) and Anastasia-Olga Tzirides (Ph.D. ‘22 EPOL), played lead roles as co-editors alongside Kalantzis and Cope.

Those aren’t the only four with connections to the College of Education who were instrumental in creating the book. Contributing authors include:

  • Robb Lindgren, professor of Educational Psychology and Curriculum & Instruction
  • Luc Paquette, associate professor in C&I
  • Vania Castro, teaching assistant professor in EPOL
  • Sourabh Garg, doctoral student in C&I
  • Shafagh Hadinezhad, doctoral student in C&I

Eleven other contributing authors have connections to Illinois, as current Ph.D. students or as faculty at Illinois or other institutions after receiving their Ph.D.s from Illinois.

Exploring the Cyber-Social Relationship

Trust and Inclusion in AI-Mediated Education is a resource for researchers and practitioners. The book shares the history of developing AI technology and algorithms and offers theoretical models for best practices, interpretation, and evaluation. It takes into account the needs of contemporary learners in a cyber-social environment and presents in-depth analyses of applications of AI technologies in learning environments and classroom assessments.

“The relationship is cyber-social,” Kalantzis says. “It’s the interface between the machine and the human.”

“We downplay the term artificial intelligence because it has a connotation that it copies something that humans do,” Cope explains. “In some respects, artificial intelligence does a lot more than we can ever do, and in other respects it does a lot less than we hope humans can do. So, the question is not how can we use AI to replicate human capacities, but how can we build productive relationships between the cyber and the social.”

Cyber, he adds, is not just the machine; it’s the machine that is constantly adjusting its direction based on its own feedback system and the direction you want it to go. “Cyber comes from the Greek word kybernḗtēs, which means steersman or governor,” Cope says. “It’s like the person with the steering oar at the back of the boat who keeps adjusting the direction of the boat. The technology is not the machine. It’s a relationship.”

The book models what the future needs in scholarship and ideas, Cope notes. “There’s also this thread through it about the role of educators and the education system and how it needs to change—not just patch on this stuff, but to change fundamentally—in order to be able to prepare another generation of humans who can make a positive difference in the world as well as achieve their own goals,” he says.

Cyber-Social Learning

A few of those goals for educators include making teaching more enjoyable and effective and keeping students engaged, Cope says.

“The formula for lack of engagement is one teacher standing in front of a class and all the students having to be mostly silent while they listen,” he notes. “What we have is environments that are intrinsically alienated by virtue of their communication architectures and the one-to-many relationship. The affordances of the new digital technology can allow classes to be a series of many one-to-one relationships coordinated by the teacher and in the teacher’s voice, that can be very powerful.”

That, of course, is where cyber meets social.

Creating Specific Courses

Cope details using his and Kalantzis's CGScholar platform, a social knowledge ecosystem for teaching and learning across all subject areas, from grade 4 through university. They have just released a completely new version of the platform, "Cyber-Scholar." Using the latest techniques of prompt engineering and retrieval-augmented generation (RAG), they have shown that it is, in effect, possible to clone a teacher. The AI allows the teacher's voice to be multiplied, with every student receiving immediate and customized feedback on their work.

“We take the teacher’s rubric, their learning objectives, assessment rubric, all the readings they have prescribed, and program these into the AI. This creates a knowledge base, a corpus for the topic the teacher is dealing with. It means that via the AI, the teacher can effectively be dialoguing with all the students at the same time,” Cope says. “Last week, we were trialing Cyber-Scholar in a local high school. We explained to the class, ‘Okay, students, we haven’t replaced your teacher, we’ve cloned her so she can be responding in a very nicely personalized way to all of you at once, because the AI is speaking to you from all the stuff she has put into the system. And through the AI, you can even ask her questions if you want a longer explanation of her feedback.’”
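The workflow Cope describes—loading a teacher's rubric and readings into a knowledge base, then retrieving from that corpus to ground the AI's feedback—can be sketched as a minimal retrieval-augmented pipeline. The sketch below is purely illustrative: the function names, the bag-of-words scoring, and the example data are assumptions for demonstration, not part of CGScholar or Cyber-Scholar.

```python
from collections import Counter
import math

def tokenize(text):
    # Lowercase bag-of-words tokenization (a deliberately simple stand-in
    # for the embeddings a production RAG system would use).
    return [w.strip(".,!?;:").lower() for w in text.split()]

def cosine(a, b):
    # Cosine similarity between two token-count vectors.
    ca, cb = Counter(a), Counter(b)
    dot = sum(ca[t] * cb[t] for t in ca)
    norm = (math.sqrt(sum(v * v for v in ca.values()))
            * math.sqrt(sum(v * v for v in cb.values())))
    return dot / norm if norm else 0.0

def retrieve(corpus, query, k=2):
    # Rank the teacher's corpus passages against the student's work
    # and keep only the k most relevant.
    scored = sorted(corpus,
                    key=lambda p: cosine(tokenize(p), tokenize(query)),
                    reverse=True)
    return scored[:k]

def build_feedback_prompt(rubric, corpus, student_work):
    # Combine the rubric with retrieved course material into a grounded
    # prompt, so a language model responds "in the teacher's voice."
    context = "\n".join(retrieve(corpus, student_work))
    return (f"Rubric:\n{rubric}\n\n"
            f"Course materials:\n{context}\n\n"
            f"Student work:\n{student_work}\n\n"
            "Give personalized feedback in the teacher's voice.")

# Hypothetical rubric and corpus, invented for this example.
rubric = "Thesis clarity; use of evidence; organization."
corpus = [
    "A strong thesis states a claim the essay will defend.",
    "Evidence should be cited and connected back to the thesis.",
    "Photosynthesis converts light energy into chemical energy.",
]
prompt = build_feedback_prompt(
    rubric, corpus, "My essay argues a thesis with cited evidence.")
```

The retrieval step is what keeps the feedback anchored to what the teacher "put into the system": only the passages relevant to the student's work (here, the writing advice, not the unrelated photosynthesis passage) reach the model's prompt.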

“You do have to intervene that way,” Kalantzis explains, referring to being intentional about building rubrics for prompts and corpora of learning content. “This will be an essential part of the future work of teachers. We’re not just saying there’s this huge new thing out there. We want to show how you as a teacher can access it. The beauty of this is, you can do this in any subject area.”

That will necessitate a change in teacher education, she warns. “We need to prepare our educators to be prompt engineers, to create these very specified corpora, to know how to assess and evaluate them as they come through,” she notes. “Cyber-social learning is a whole new way of operating and collaborating with other teachers, to construct these environments for teaching and learning.”

The question, she adds, is no longer whether a teacher is going to use AI; it’s how that teacher will use it.

“It’s no longer a choice,” Kalantzis says. “It’s like the invention of the printing press and the book. It wasn’t a choice of whether you use it. With AI, it’s how do we in education harness it and shape it, what skills do we need? Because the skills that we have for teaching and learning via the ‘book’ are no longer sufficient to prepare learners for the skills they need to learn and live in a time when AI will be increasingly pervasive.”

Can AI Be Trustworthy and Inclusive?

The title of the book, which has 13 chapters and 30 contributing authors, includes two words that are at the center of controversy in AI debates: trust and inclusion.

Many educators have raised concerns about generative AI tools propagating racism, sexism, ableism, and other systems of oppression. This can happen because AI tools generate responses based on the corpus of legacy information they work from and the prompts they are given. That corpus, coming mainly from the web, can include harmful as well as useful material. AI models can generate fabricated information, raising ethical concerns as well.

Trust and Inclusion in AI-Mediated Education: Where Human Learning Meets Learning Machines addresses trust, ethics, equity, transparency, and related issues.

“Learners have to be able to trust that their voice matters, that it will be dealt with in the new framework, with the AI responding appropriately to them,” Kalantzis says. “They have to trust that they are included as equal in their difference but with common goals of achievement. We want them all to achieve. So how do you track that performance to ensure that each student, within their differences, can still meet whatever the class and the school requires of them as well as their own sense of having achieved a goal that they figure is necessary for their future?

“It’s a different orientation than we’ve had in the past, and we believe that digital content and the ‘artificial intelligence’ give us an opportunity to do that. As researchers, we’re testing it out, and as scholars we’re writing about it.”

The goal, she says, is to find the practical ways in which teachers can use AI to help every student achieve their goals.

“That’s inclusive education without needing identity politics,” she says. “Because we’re pitching this to teacher education folks as well, we have to be mindful that the populations they deal with are very diverse. So, how does this new technology allow us to deal with that diversity, to allow voice and agency to come through for a classroom of thirty kids? We’ve always wanted to personalize education. Now we can do that.”

Role of Educators

Far from threatening or diminishing the role of educators, generative AI can be used to enable and enhance the profession.

“Educators and teachers are going to be needed more than ever to be engaged in this cyber-social relationship,” Cope says. “The role of the educator, the role of pedagogy, the role of design, of integrating the cyber and the social and how to evaluate it, becomes even more important and more imperative. So, we have to train our educators differently and we have to continue to be learning. And the new knowledge must be collaborative.”

“Through our Cyber-Social Learning Lab, through our CGScholar platform, through our Learning by Design Leadership program, we are trying to anticipate how we prepare educators across the spectrum, from kindergarten through university to the workforce, for the future,” Kalantzis says.

Continuing Work

The new book was just released, but Kalantzis and Cope are not slowing down their work by any means.

“We’ve started a new book already in this Springer series,” Cope says. “We have a lot of really exciting material coming through. Because in this world, everything changes by the day.”

“We are linking theory with practice,” Kalantzis says. “We want to disseminate not just the new theories that are evolving about the nature of learning and the nature of education, but to disseminate more broadly the significance of education as a system, because it is being attacked left, right, and center. Formal education as a system matters, but it needs to break down some of the barriers and to use the new tools that are available to create a different kind of ecosystem of teaching and learning. As well as teachers, you will have to convince parents and policymakers of that. You will have to convince principals of that. So, our job is to get out the evidence of practice as well as the theoretical issues involved to as broad an audience as we can.”

Cope sees his and Kalantzis’s obligation as scholars to be “influencing the world, changing the world of education, collaborating in order to get these ideas out and to improve teacher practice and learner performance. Because if we don’t, the world will probably drift into not a very good space. There are a thousand people like us who are actively out there trying to change the world in constructive and inclusive kinds of ways. Not just doing AI to replicate the brain, but to build trustworthy and inclusive AI, which extends our humanity and capacities.”