Business School considers “AI avatars” of teaching staff
Imperial Business School is looking to roll out “AI avatars” of its teaching staff for selected modules following a pilot programme.
The “avatars,” lifelike AI-generated simulations of consenting professors, would be accessible via existing teaching platforms such as Insendi.
Trained on lecture slides, readings, and other course materials provided by module leaders, the avatars can answer students’ questions on demand, with responses tailored specifically to course content.
The debut would follow two years of pilots run by Imperial’s IDEA Lab, where the software was tested by students across online and blended Business School programmes. AI-generated avatars of Imperial professors Mark Kennedy, Omar Merlo, and Bart Clarysse – named “MarkBot,” “OmarBot,” and “BartBot” respectively – were created using existing recordings of the academic staff made for online module content.
“They’re trained to simulate as much as possible the interaction that the students would have with me over a live call,” Professor Merlo told Felix.
“You’re not getting some generic answer from some generic AI that hasn’t sat through my course – you get an answer from somebody who has written the slides, written the book, prepared the content, delivered the lecture,” he continued.
Recounting their experience using the tool, a student told the university’s press team: “It’s him, just in another form. It makes it feel more like he’s here with us.”
The initiative is an attempt to ensure equal access to AI tools across the Business School, and to establish a degree of control over how such tools are used by students, the IDEA Lab told Felix.
The university prohibits students from uploading course content onto third-party AI tools such as ChatGPT and NotebookLM without consent from teaching staff. Meanwhile, research conducted by the Business School found that over 80% of Imperial students regularly use Generative AI for their studies.
By using its own platform, the Business School can examine anonymous learning analytics, including transcripts of conversations between student users and avatars. This, the IDEA Lab explained, would enable module leaders to identify areas where students are less confident and where additional clarification might be needed in upcoming teaching sessions.
“All of this helps the school improve the quality of its teaching,” they said.
Surveys carried out by the Business School in 2024 revealed demand for AI avatars, with 61% of respondents saying they were likely to use them for their studies. Between 10% and 15% of students said they found the idea unappealing and were unlikely to adopt the technology.
The IDEA Lab emphasised that the avatars are not intended as a replacement for face-to-face teaching. “We are keen to reassure students that our goal is not to replace the human elements of teaching with AI tools, nor to supplant any existing aspects of our programmes,” they insisted.
The team is hoping that the avatars will encourage critical thinking among students, instead of just “providing them with answers.”
Scholars have raised concerns about rising dependence on AI chatbots in education, warning that it threatens students’ ability to think for themselves and engage deeply with complex materials. At the same time, increasing use of AI among teaching staff is sparking fears of education being reduced to “a conversation between two robots.”
“I think it would be foolish to hide from it, to pretend it’s not there, or to impose heavy restrictions on how students should use it,” Professor Merlo said of AI in education.
The IDEA Lab is due to release the results of its pilot phase this spring. “We continue to work closely with module leads and academic directors to discuss embedding AI into their modules and programmes, including through the use of AI-support bots, and are considering the impact on student experience, learning and teaching pedagogy,” they said.