Students largely agree with the principle of the Teaching Excellence Framework (TEF) but are less convinced about the way it is currently implemented, according to a survey commissioned by a consortium of student unions, including Imperial College Union.

The report, Teaching excellence: the student perspective, analysed data from a survey of 8,994 students at 123 higher education institutions. It found 84% of students agreed universities should be encouraged to provide excellent teaching. This is the driving idea behind TEF, which uses three metrics to judge teaching quality and award universities a gold, silver, or bronze rating.

The current methodology considers graduate prospects, drop-out rates, and National Student Survey (NSS) scores. More than half of students favoured their feedback being used to evaluate teaching. However, the weighting given to NSS scores will be halved in the third year of TEF, meaning student feedback will be “significantly less important” in future TEF rankings.

When asked to rank factors that demonstrate excellent teaching, graduate outcomes came bottom. While 68% of students said universities should be responsible if teaching is “not good enough to enable them to succeed”, only 34% believed universities should be held to account if job rates are low. Just 18% agreed universities should be accountable when students drop out.

The vast majority of respondents said the TEF should also consider IT facilities, library services, and course-specific resources.

Most students were in favour of the medal ranking system, but three in five were against linking TEF ratings to tuition fees. Under initial plans, universities with a gold rating would have been able to increase their tuition fees in line with inflation. However, this has since been derailed by Conservative plans to freeze tuition fees in a bid to win over young voters (Felix 1670).

Problems remain with the medal system. Half of students said they would reconsider or not apply to their current university if it had a bronze rating; for gold-rated universities, the figure was 6%. This varied by ethnicity: BAME students were twice as likely as white students not to apply to a gold-rated university.

There were also fears that the rating of a university could harm graduate prospects. Students were concerned employers would look more favourably on someone with a 2:1 from a gold- or silver-rated university than someone with a first from a bronze-rated university. Furthermore, because ratings are awarded at institution level, variation across departments could mean gold-rated universities run courses of bronze quality and vice versa.

The Union said: “We’re pleased to have worked in partnership with over 20 other students’ unions to gain an insight into what students think of the Teaching Excellence Framework, and we’re proud that our research is having a real impact on the debate within the higher education sector. We look forward to raising the findings with College through our Academic Representation Network, highlighting the importance students place on teaching excellence and the quality of resources, rather than employment rates, when deciding if college has met their expectations.”

TEF ratings released earlier this year caused controversy when several Russell Group universities were given bronze ratings. Imperial was one of eight Russell Group universities that achieved gold status. At the time, Imperial provost Professor James Stirling said: “Excellence in education is at the very heart of our mission. Our teaching must be as innovative, agile and world-leading as our research.”

The survey was conducted by trendence UK and gathered quantitative and qualitative data from undergraduate and postgraduate students.

Universities are keen to achieve high TEF scores, since many feel they are attractive to potential students. However, six universities have recently been reprimanded by the Advertising Standards Authority (ASA) over unsubstantiated claims in their marketing materials. The University of Leicester, Falmouth University, Teesside University, the University of East Anglia, the University of Strathclyde, and the University of West London were all instructed to remove material from their websites that misleadingly implied they were leaders in specific subjects or regions.

Unintended Consequences

A consultation published in September 2016 revealed concerns that using graduate prospects would disadvantage universities offering courses in fields such as creative arts, where “highly-skilled employment” is less common. This would also be the case where graduates take a gap year or work in lower-level jobs before gaining highly-skilled employment. The report also revealed an “associated risk” of universities accepting fewer women, ethnic minorities, and people from disadvantaged backgrounds on to their courses in order to achieve a higher graduate employment rating.

Non-continuation is more commonly the result of university-related factors other than teaching quality. An “unintended consequence” of the non-continuation metric was also noted: universities may reduce the number of students who are “less likely to succeed” or make courses less demanding in order to improve retention. The metric is therefore not strongly linked to teaching quality and could push universities towards accepting only privileged candidates who are more likely to succeed. But the measure remains, with the government insisting that non-continuation rates are “a good proxy for student engagement”.

Student satisfaction can be a useful reflection of teaching quality, so long as data from the “teaching on my course” questions of the National Student Survey are used rather than overall satisfaction rates. The main problem here is that students tend to award lower satisfaction scores to female and ethnic minority academics. Not to worry though: government analysis insists there is no significant relationship between the proportion of female academics and satisfaction scores and only a “very small” relationship regarding ethnic minorities.