at the end of this posting.
With the impending summer examinations cancelled, or hastily rearranged online, the coronavirus pandemic has applied the brakes for students in a very dramatic fashion. Most university students are currently at a loose end, trying to study for something that might not happen, or might not take the form expected. School and college students trying for university will have to rely on their teachers’ assessments. The fear is that the system will be unfair and favour the more advantaged students. This fear has been raised in several quarters and must be addressed by universities and the various examination boards. Equality of opportunity and fairness should be central to the system. Addressing this offers a chance to improve how assessments are carried out in the future and to bring into sharp focus the inequalities that already exist.
The examination ‘nightmare’ for many students at University.
The Guardian revealed on Wednesday how university examinations would impact on students of different means. In ‘“It’s a nightmare”: how coronavirus is wreaking havoc on students’ exams’, the many reasons to consider cancelling exams were well outlined. These mirror concerns raised by TEFS on 16th March 2020 with ‘Impact of Coronavirus measures on the working student: The nudge that breaks the camel’s back’, which raised the many problems facing students, not just examinations. Many students will have lost their part-time jobs and will not be eligible for sick pay. Accommodation will become problematic for many, and access to internet facilities difficult as the libraries close. Some will have no ‘home’ to go to. The existing problems of inequality will be greatly magnified by the pandemic measures.
Simon Hunt, a third-year undergraduate at Oxford University, writing in Times Higher Education today, added to the worries with ‘Students need maximum flexibility for assessments from home’. He reiterates the same issues for students and calls upon the university to respect “the diversity of wishes by different students, which can only reflect the diversity of their circumstances”.
In the Guardian article, final-year Cambridge student Daniel Wittenberg gives a very cogent reason why the examinations must be cancelled: “We’ve spent our whole lives preparing for a very different type of exam…. The university [can’t] pretend this is going to be a real reflection of our abilities.” He is indeed right. Some universities are moving to the idea of ‘open book’ examinations online. In science, these will be problem-based and will test understanding in a very revealing way. When I was a student in the 1970s, I had to sit several of these, where time could be taken out to visit a library or bring in books. They were unpopular amongst many students because they often exposed a lack of understanding of the material and were difficult to prepare for. However, we were helped generously with challenging problem-solving tutorials. We also ran problem-based examinations at Queen’s University Belfast throughout the 1980s, and students expressed concerns every year. But we included tutorial support well before the examinations, with guides on how to approach a novel problem. The inherent logic was that ‘life is like that’. Therefore, Daniel is correct. It is not reasonable for a university to do a ‘handbrake turn’ and bring such examinations into play without prior tutorial support on how to approach them.
The impact on students sitting examinations before university.
Earlier this week, a timely article by Dennis Sherwood was posted by the Higher Education Policy Institute (HEPI) that called for teachers to be trusted: ‘Trusting teachers is the best way to deliver this year’s exam results – and those in future years?’. The reason is that teachers will find themselves burdened with much of the responsibility for the qualification success of their students. Dennis Sherwood is a consultant running ‘Silver Bullet Machine’, and his background as a successful scientist and pioneer of business and financial modelling makes his ideas compelling. Writing for HEPI in January of this year, in ‘1 school exam grade in 4 is wrong. Does this matter?’, he concluded that as many as 25% of exam grades were wrong. Indeed it does matter, particularly in the light of current restrictions on assessments for students hoping to attend university later this year. As an aside, I had a narrow escape when I dropped a predicted grade in A-Level Physics, having taken the examinations a year early at the age of sixteen/seventeen. I found out later that the difference between grades was very small (see TEFS 17th August 2017 ‘A-Level Playing Field or not: Have things changed over time?’ for more of the context). I narrowly missed out on going to medical school a year early, something that would have been a big mistake for me and any unfortunate patients I might have encountered. By the following year, my career in Biochemistry and Microbiology had emerged as the best way forward.
Trying to assign grades fairly.
Furthermore, the guidance states: “To produce this, teachers will take into account a range of evidence and data including performance on mock exams and non-exam assessment – clear guidance on how to do this fairly and robustly will be provided to schools and colleges. The exam boards will then combine this information with other relevant data, including prior attainment, and use this information to produce a calculated grade for each student, which will be a best assessment of the work they have put in”.
These moves put considerable pressure back on schools, and particularly onto teachers, to make important decisions that will impact greatly on the lives of their students. However, there is some encouragement in the statement that, “We will also aim to ensure that the distribution of grades follows a similar pattern to that in other years, so that this year’s students do not face a systematic disadvantage as a consequence of these extraordinary circumstances”. But there is no detail yet and it remains to be seen if this can be achieved in time.
It might be possible to ‘normalise’ the results across various schools by comparing them with results from previous years. This would be a significant task for any exam board and would have to be coordinated across the various boards. They might want to collectively consider the report of the Sutton Trust in 2017, which concluded ‘Poorer pupils at a disadvantage over A-level grade predictions’. This is now a well-established phenomenon that infects the whole system. Simply put, disadvantaged students applying for university are more likely to find that their predicted grades were lower than the grades they go on to achieve in the examinations. Setting grades this year on the evidence of inaccurate predicted grades is likely to greatly suppress their achievement unless some other factors come into play. It is a monstrous problem for all concerned, but it must be addressed. The full findings are in the Sutton Trust report ‘Rules of the Game’, which has been strikingly influential in promoting the idea of contextual admissions and should be compulsory reading for all examination boards and universities.
Progress across the UK so far.
Despite the government’s aims, it seems that the various examination boards have yet to reveal the details underneath their objectives. They might not have been prepared for this eventuality. It is a strange position to find themselves in, as surely contingency measures were planned in advance. In England, AQA, OCR and Edexcel explain that they have much to do. In Northern Ireland, CCEA go a little further in stating that they will “make sure grades are fair across schools and colleges”. This perhaps indicates that they might be trying to normalise grades across schools and colleges.
In Scotland, SQA is in a similar position and is posting updates of progress on an almost daily basis. However, earlier this week the First Minister had to step in, and SQA quickly reported that “given the latest changes in public health advice, no young person with SQA coursework to complete should attend school to do so. We will provide further guidance on the completion of coursework and details of our approach to certification as soon as we can”.
In Wales, WJEC/CBAC spotted a similar problem and stated that “Given the timing of the school closures, not all learners have been provided with the opportunity to complete their internally assessed NEA work and controlled assessments. To ensure fairness for all learners, WJEC will not moderate any NEA units or controlled assessments for GCSE, AS/A level and Skills Challenge Certificate qualifications for the summer 2020 series”.
The conclusion is that there is no easy way out for examination boards and universities. The plea here is that equality and fairness sit in full view at the top of the priorities, and that each board is open and clear about its methodology. Back in 1972, when I had my narrow escape from almost entering a medical school, the methods were totally obscured from view. Things seem to have moved on, and I hope that, whatever happens, the class of 2020 are dealt a kind hand in the game of fate.
Comment from Dennis Sherwood, Silver Bullet (www.silverbulletmachine.com):
Might this be more easily achieved at schools, in contrast to HE?
School teachers are, in general, in regular contact with students for several hours each week, over two years or more, and so have considerable knowledge not only about each individual student, but also about each student’s standing relative to the peer group – so helping the judgement of a fair rank order, which I think must precede any attempt to assign grades. That said, an important problem-to-solve in any valid process is to ensure that no student fears “the teacher has it in for me” or “[Chris] is bound to get a good grade – but I’d rather not be ‘teacher’s pet’!”
In HE, by contrast, lecturer-student interaction can be much more sporadic, and the assessor has only limited information. Any corresponding judgement is therefore based on much weaker evidence.
And let me note that the importance of having trust in, and respect for, the assessor is not solely a requirement in schools and HE. I remember vividly an annual appraisal I experienced (if not suffered) when I was a partner in a large consulting firm. My assessor had no idea what I had been doing the previous year, no relevant information at all. But that didn’t prevent the passing of judgement.