COMMENT

The exam circus

SecEd’s Daniel White reported on his first exam results day this summer. Here is his take on the media circus that always surrounds the publication of GCSE and A level results
EVERY AUGUST, the media circus around the publication of GCSE and A level exam results leads to all manner of headlines, often linked to league tables and accusations of a lowering of standards. Yet how does it all happen behind
the scenes? How does the media decide how it will report the results? And why do the same old headlines always seem to appear? This was my first ever experience of results season.
Being aged 22, the only side of results days – GCSEs or A levels – that I had ever seen was when I was opening my own envelopes (I remember that there were headlines about “dumbing down” then as well). So on A level results day this year, as I headed
down to the London HQ of the Joint Council for Qualifications (JCQ) – where the results are unveiled to the nation’s press – I did not know what to expect. I was bleary-eyed given that it was a 7:30am start,
but I was also intrigued. I was about to be “locked in” by the powers that be as they prepared to reveal what the results had yielded. We were to be told a full two hours before they were officially published in order to help us prepare our copy in time. And so at 7:30am, under strict instructions that
nothing should be broadcast until 9:30am, I was swept politely into a room in the JCQ offices, which are a stone’s throw from Parliament, and handed the results. As I entered, I expected to find journalists on their phones, flicking through papers madly, or running around trying desperately to find out what percentage of students had gained C-grades in their English literature or some such other subject. But it was nothing of the sort. In fact, there was near
silence. A couple of journalists were checking facts; in the corner sat a nice tray of Danish pastries, jugs of orange juice and water and, most importantly, a pile of folders containing all the exam results information for everyone to digest. Graeme Paton, education editor for the Daily Telegraph, told me that although he approaches both days with an “open mind”, there are common themes of interest to his readers which make it difficult to avoid particular “angles” on his stories. He said: “All journalists, I think, go into GCSE and
A level results season with a genuinely open mind, trying to come up with a fresh way of approaching the subject. But it is very hard to pull yourself away from the familiar themes: have results gone up again? Are kids brighter or are exams getting easier? Are
boys falling further behind girls? Are pupils studying the ‘tougher’ subjects? Are independent schools still performing better than state schools? Not least because these are the issues that consistently generate the most interest and, in one way or another, touch the most people. “As much as we might try, it always seems to be
the case that, when the press conference on results day starts, the senior examiners and, therefore, journalists themselves always appear to migrate towards these issues.” In the press room, the journalists from the
various national media began discussing the results, commenting on particular aspects. There were surprised tones as they discussed the numbers studying the various subjects – the rise in physics entries was mentioned, questions were being asked about how the subjects in the English Baccalaureate had done, and heads were shaking at yet another drop in the numbers studying modern languages.
Then it is stop your talking, pick up your pens – the
JCQ press officer informs us that the press conference is to begin. Around 40 people make their way down the corridor into a room with three people at a top table – Andrew Hall, chief executive of AQA, Mark Dawe, chief executive of OCR, and Ziggy Liaquat, managing director of Edexcel. First off – a brief presentation by Mr Hall going
through the headline figures: the number of 18-year-olds studying A and AS levels and patterns in the subjects. He focuses on maths, with its increase in entries of 40 per cent over five years, and physics, up by 20 per cent over the same period. The three men at the top table know that they have a wonderful opportunity to steer the national media, and these messages have been much discussed, I am sure. As the presentation ends, the floor is opened up to
questions and the journalists now take the lead: why is there a rise in maths? What is it about science that has led to the increase? What does the future hold for modern languages? It is here that the soundbite that might make the day’s headlines is likely to come. And it doesn’t take long. As the experts fire back the usual answers about employer demand driving STEM take-up, one quote rings out: “It could simply be the Brian Cox effect.” Professor Brian Cox – yes, him off the telly; the man
from the band D:REAM who released the single Things Can Only Get Better. Fast forward 24 hours and the BBC has a feature on the “Brian Cox effect”, and throughout the next few days the celebrity professor’s name is to be found in numerous headlines and news reports. All stemming from one seemingly off-the-cuff comment. The GCSE results day a week later took place in
much the same vein. On this day, Brian Lightman, general secretary of the Association of School and College Leaders, found himself at the centre of a press scrum after “dropping in” to the press room. An astute move by his press team. I did actually feel sorry for Mr Lightman. He
answered four questions from three journalists before more newshounds entered the room and proceeded to ask him the same questions; this happened four or five times. As well as becoming rather repetitive for the general secretary, it also highlighted to me just how much the national media do focus on those familiar issues. Both results days were remarkably similar in the events that took place, how the days unfolded, how the journalists largely ran with the same angles on their reports, and how just one or two soundbites made the main headlines of the day.
SecEd

Psycho babble

Demotivating results

A STRAW poll of secondary school teachers and students in my local area confirms that there is discontent about the latest round of exam results – at both GCSE and A level. Last year, more than 160,000 papers were re-marked after schools queried grades and that number looks set to increase this year. The truth is that this system not only produces wildly inconsistent (and often inaccurate) grades for the purposes of assessment, but also undermines students’ motivation and self-belief, not to mention learning.

At two of my local schools, whole classes had their papers re-marked and raised, on average, by up to two full grades; another school attempted the same operation only to be told that the papers had been “lost”. This is an unfathomable situation for young men and women who need to understand the results they have, after much hard work, achieved.

There is plenty of research pointing to the fact that grades that are considered “unfair” serve only to dispirit students and disrupt their approach to learning. Moreover, Gibbs & Simpson have found that grades without feedback can be particularly damaging, negatively affecting students’ “self-efficacy” or sense of competence. This is important because self-efficacy is strongly related to effort and persistence with tasks (Schunk, 1984; 1985), predicts academic achievement, and is associated with adopting a deep approach to learning (Thomas et al, 1987). In contrast, feedback concerning content provides the student with options for action and is less closely associated with their ego. Wootton (2002) has written passionately about the negative impact of assessment on “at-risk” students and asks whether the system exists “to encourage learning or to measure failure”.

Under the current system, students are presented with a grade without any firm understanding of why it was allocated. They (or their schools) can pay to have their papers returned or re-marked, but this is a time-consuming process that can damage student motivation, inhibit progression and even prevent students from acquiring places at university. There is no feedback, and no system for cross-checking results against usual performance, either.

This type of “high-stakes” testing is suspect anyhow. A great deal of research suggests that this strategy accelerates a reliance on direct-instruction techniques and endless practice tests. This fosters low-level uniformity and has the potential to subvert academic potential, according to Dorothy Strickland, a professor at Rutgers University in the US. What’s more, Gibbs & Simpson have pointed out that “the most reliable, rigorous and cheat-proof assessment systems are often accompanied by dull and lifeless learning that has short-lasting outcomes”. They wrote: “We are not arguing for unreliable assessment but we are arguing that we should design assessment, first, to support worthwhile learning, and worry about reliability later. Standards will be raised by improving student learning rather than by better measurement of limited learning.”

According to a 1987 paper (Chansarkar & Raut-Roy), students tend to gain, on average, up to 12 per cent higher marks from coursework than they do from examination. Coursework marks are also a better predictor of long-term learning. Conway et al (1992) reported a study of the performance of psychology students on a range of tests of their understanding and recall of the content of a cognitive psychology course taken many years before. They found that student marks on coursework assignments undertaken up to 13 years before correlated with these test scores, while students’ original exam marks did not.

There is something deeply flawed about the current system, which is designed for ease of application rather than providing a genuine assessment of a student’s abilities or, indeed, motivation to continue to learn at a high and creative level.

• Karen Sullivan is a bestselling author, psychologist and childcare expert. She returns in a fortnight.