Proof that GCSEs are more rigorous than O Levels?

24 June 2012

I spent an interesting morning at Television Centre today, appearing on Broadcasting House, the Sunday morning magazine show hosted by the affable Paddy O’Connell. Taking a light-hearted look at the current O Level and GCSE debate, he sat an English O Level question and a GCSE one; both were ‘writing’ or composition questions. As a former exam board marker and veteran of marking GCSE scripts for twenty years, I was called in to assess his work. It was really interesting. The O Level question asked candidates to write about someone who thinks they are better than they actually are, and the GCSE question was to analyse the importance of listening. Paddy turned in a creditable O Level answer but spelt the noun form of ‘practice’ incorrectly, using ‘s’ instead of ‘c’, and somewhat inexplicably spelt ‘lily’ incorrectly! He clearly knew how to spell the word (he gave me the correct spelling after the show) but must have suffered under the stress of exam conditions. His answer was funny: he owned up to being the person who thought he was better than he actually was!

The interesting thing about marking his O Level script was that there were no assessment criteria to hand, only the examiner’s report from that year. I had to improvise marking the essay; it was clearly an A grade. The tone, the paragraphing and the structure of the essay were, as you would expect, good, and, in the absence of any clear criteria, I had to give Paddy a good mark. Having spoken to O Level markers, I have no doubt that this is how O Levels were marked. You have to remember that only 20-30% of students took O Levels; this meant you could have a smallish pool of examiners, many of whom knew each other and spoke the same ‘old boy’ language. The discourse of the examiner’s report was like listening to some old codger watching the cricket at Lord’s with a glass of Pimm’s in his hand, talking about “slapdash” standards, too many “rapes” in one question, complaining about girls using circles above their ‘i’s and stating boldly, without any quantitative evidence, that standards were slipping. Plus ça change?

Marking the GCSE essay was both easier and harder. The marking criteria were very clear; for writing they are divided into two main areas: Assessment Objectives i and ii, which are marked out of 20, and Assessment Objective iii, which is marked out of 10.

To get a top band answer, you have to meet these criteria for AO i and ii:

In this band a candidate’s writing:
• shows sophisticated control of the material and makes effective use of linguistic devices.
• demonstrates a sophisticated understanding of the task, addressing it with complete relevance and adapting form and style with flair to suit audience and purpose.
• uses precise vocabulary which is fully suited to the purpose of the writing, conveying subtlety of thought and shades of meaning, and where appropriate is imaginative and ambitious in scope.
• uses structure to produce deliberate effects, developing the writing coherently and skilfully from a confident opening which engages the reader to a very convincing and deliberate ending.
• is organised into coherent paragraphs which are clearly varied for effect and used confidently to enhance the ideas and meaning.

For AO iii, you have to meet these criteria for a top band answer:

In this band a candidate’s writing:
• uses a wide range of sentence structures to ensure clarity and to achieve specific effects relevant to the task.
• uses ambitious vocabulary with very few spelling errors.
• uses punctuation consciously and securely to shape meaning, with very few errors.

Unfortunately, Paddy’s answer was just a little wayward and all over the place, though full of good ideas. With some teaching, he would achieve a top band answer, but this one went off topic and lost focus in places, and there’s no doubt in my mind that he would not have achieved the A* one might expect, but actually a low A grade. It made me realise that at the top end, GCSE is a demanding exam; to really excel, you have to be pretty darn good! It also brought home to me, and to Paddy, just how terrifying these exams are; Paddy confessed to feeling nerve-wracked and quite stressed by it. He took the GCSE question after the O Level question and it showed; he’d lost a bit of concentration. Many of my students were taking two or three exams in one day, and I’ve noticed time and again that my classes’ results are worse if they’ve taken an exam in the afternoon. The poor things are exhausted!

I’m just not sure that these exams are an accurate test of ability, though there’s no doubt in my mind that GCSE English is a far more rigorous exam than the old O Level.

2 comments

  1. I find it amazing that you can write this in all seriousness.

    What on earth is the point of a grading system that gives A* and a “low A” grade?

    If you want any evidence that the exam system has been devalued, then there is your evidence. Why would you need to develop a grading structure within a grading structure otherwise?

    The rest of this example only goes to demonstrate that all that is required for effective “O Level” marking is some updated examiners’ notes.

    Hardly reason to be negative about the exam itself… unless you have a clear agenda to push?

    from Andy H
  2. Not sure that Andy H actually understands how grading systems work. Generally the alphabetic grades relate to a percentage range, e.g. D = 40-49, C = 50-59, B = 60-69, etc. Until the introduction of the A*, the range for an A was 30 percentage points, from 70 to 100, but even with the A* the top band still typically covers 80-100%. Putting it simply, the alphabetic grading system is not sensitive enough, hence the A* and the low A. Percentage systems make it much easier to be more sophisticated when marking.

    from JW
