News

An interview with Dr Alan Spivey

How has the rise in student use of generative AI changed your approach to teaching and assessment?

I think the first thing I would say about the HEPI report is, just a few caveats: it’s taken from about a thousand students, 1,041, all full-time UK undergraduates in various disciplines. And I’d say it’s important that we as a university also support PGT and PGR students. But actually, I think the data presented in this report is not unexpected, from my point of view. The data that we have from other sources, and also from surveys at Imperial, would suggest that GenAI tools are being used very widely. So, in direct answer to your question of how the high use of GenAI tools has affected my teaching and assessment: in terms of my teaching, I don’t think it’s changed it very much. I’m a professor in chemistry, so I give lectures to undergraduates and to postgraduate taught students on various aspects of synthetic organic chemistry. And I would say it hasn’t affected at all the way that I give those lectures and interact in those workshops, except that I draw attention to the fact that AI tools are now being used widely in industry, in the chemical industry, particularly in the pharmaceutical industry, and that this is changing some of the approach to drug discovery, for example. But the fundamental chemistry that I teach has not changed. In terms of assessment, because all of the assessments associated with those courses are in the form of an unseen examination held in the Great Hall, it also hasn’t changed much about the way that I set those. What I have done are little tweaks: I used to mention the names of molecules in exam questions, and I now don’t do that. Just little things that make a bit of a difference in the exam. But if students are using those questions for practice, which we know that students do and I would encourage, it takes away the temptation to use tools, the internet, to look up structures and the like.
I think it has more impact on coursework and it just happens that my teaching doesn’t involve much coursework.

 

Has Imperial established any guidelines or policies for AI use in assessments? If so, what are the key principles?

 

Absolutely. If we go back to mid-2023, when ChatGPT jumped onto the scene, that’s when the working party that I chair was established, as a subgroup of the main university education and student experience committee. We held a town hall at which we presented some principles around how to, what we call, “stress test” assessments. That allowed teaching staff to work with folks in our ICT department and also in our tech labs to use the latest versions of the foundational GenAI tools, and see how they performed or how they could be used to help answer the sort of coursework questions that we were setting. Subsequent to that town hall meeting, the educational development unit in the college set up a workshop which is available to all departments, and they’ve run quite a lot of bespoke workshops in the departments for staff, where they work through how to make sure what you’re examining is aligned with your intended learning outcomes, and is not susceptible to being pasted into ChatGPT and getting out the sort of verbatim answer that would be acceptable as an assessment.

 

What do you think are the biggest opportunities in integrating AI into higher education?

 

Great question. I think we can almost certainly anticipate that GenAI is going to infiltrate most areas of our lives going forward, particularly in the workplace. So I think it’s incumbent on the university to make sure our students are trained appropriately, and that means we need to make sure students have the opportunity to learn about the fundamentals of AI and about its limitations: sustainability concerns, ethics concerns, concerns about hallucinations, as they’re called, so incorrect information being provided, and about biases. But also about how these tools can save you huge amounts of time. No one wants to do drudgery tasks. The way I see it is that if we work with AI then we can concentrate on the things that are exciting and add value, and let the AI do some of the things which are relatively straightforward and are done really well by some of these GenAI tools. And I guess the other thing to bear in mind is that these tools are rapidly becoming more powerful, and that’s both a challenge and an opportunity, I think.

 

Do you think the use of generative AI will impact professional development and soft skills?

 

Interesting. I was chatting with someone just recently about using AI to write emails, and they said that was the last thing they would use AI for, because for them an email is a very personal piece of communication. I kind of agree with that, but if I were in a position where I needed to write lots of very repetitive emails, then yes, it makes absolute sense to set up a bot to help you, and then maybe just keep a last editorial control over the output. I think it's all about using AI sensibly. I was also doing something on the web the other day, and I put in a query to a company, and I got a reply which was clearly from an AI bot. I was really frustrated, because I wanted an answer from a person. So I think we've just got to be very careful about how we use AI. It's got great opportunities, but it can also alienate, I think, very easily.

 

How do you think AI can be leveraged to support student learning without replacing critical thinking?

 

There are quite a lot of approaches that have been talked about quite widely. At Imperial, one was certainly explored early on in the business school. There was a module where, previously, the instructor had asked the students to write an essay about a certain topic at the end of the module, and he switched it around so that he asked students to use a generative AI tool of their choice to provide a skeleton, and then to demonstrate how they added value to that by going through it, fact checking, and taking the ideas it threw up. So you can have that kind of almost-conversation with a generative AI system, and demonstrate that you as an individual are adding value while working with GenAI. I think that works quite well. I think the advent of GenAI has also accelerated the process of trying to do authentic assessment, in other words, trying to set assessments which are as similar as possible to the sorts of tasks and challenges that you might meet in the workplace. So they're not artificial. They're not things which are just hurdles for students to jump over, where you have to demonstrate some particular pieces of specific knowledge; rather, you have to take that knowledge in the context of a scenario and then apply it, so that the GenAI can help you with the fundamental concepts, but you are interpreting them in the context of a scenario. And I guess you're from medicine, so it's perfect for that kind of thing, where you can use the GenAI to get you some, let's say, facts and some principles, but then you have to apply them to a particular situation which is outlined.

 

How do you see the role of AI in higher education evolving over the next five years?

It's a really, really difficult question. Things are moving so fast. To answer it, what we have done at the university is appoint what we call generative AI futurists. These are members of staff who've had their time bought out. We have one in each faculty, and one covering the non-faculty areas like the library and the CLCC, the Centre for Languages, Culture and Communication. And the purpose of buying out those individuals' time was precisely this: it was saying, can we look into the future? Can we have a kind of crystal ball and ask ourselves what a curriculum of the future will look like in our disciplinary areas? They've been in place for about a year now, and actually we're looking to increase the amount of time that they have, because they've been liaising with staff, and a lot of what they've been doing has turned out to be around training our staff to understand the capabilities of GenAI. So I'm kind of copping out of the question. It's very, very difficult to predict the future, but I think it will certainly shape our curricula quite dramatically, and I think move us into a place where the assessments, the way that we assess students and gauge mastery of disciplines, are very much rooted in the real world, in active learning, trying to simulate problems which will be faced in the workplace and which will allow students to solve the difficult problems faced in all kinds of walks of life.
In chemistry, it is probably easier for me to say that AI is revolutionizing the way we make molecules, and it will soon be the case that you will be able to use a generative-AI-based algorithmic system to give you good suggestions as to how to make molecules, and potentially actually synthesize, using robots, the molecules you're after, and that will vastly accelerate drug discovery. So we need to make sure that that sort of revolution is reflected in our curriculum.

 

Is there anything else you’d like to add?

No, I don't think so. Well, the other thing that I think is quite contentious around AI is something that we are exploring, because I'm also involved in the Digital Education Fund. This is a fund which allows staff, or potentially students, not necessarily academic staff, but anyone in the university, to apply for funds for educational projects. And we're funding quite a lot of projects looking at that delicate balance between using AI to provide rapid formative feedback, and when that is appropriate versus when a human intervention is appropriate. Getting that balance right matters, because we can see that you can use AI to give students very rapid and quite rich feedback on things like mathematical solutions, but where do you draw the line between that sort of bot-driven feedback and talking to a professor about it? I think finding that balance, and taking students with us on that journey, is super important, because we don't want students to feel alienated and think, “Oh, I'm getting marked by a bot; this is not useful to me; that's not what I paid to come to Imperial for.” But on the other hand, if you're getting better feedback and it's much more timely, then there's a balance to be found, and finding that balance is a super difficult thing, and we're funding quite a few projects in different departments to try and establish it.

From Issue 1870
