Research, GCSEs and Falling Standards…

Yesterday saw the publication of a detailed and thoughtful piece of research by Professor Coe of Durham University (http://www.cem.org/attachments/publications/ImprovingEducation2013.pdf). It bravely offered the view that instead of standards having risen steadily over the past twenty years or so, it seemed that they had, at best, held steady. He offered an impressive set of data with clear analysis and set out a demanding challenge for the education system. People who know my blog may be expecting me to launch into a criticism of this work, but to do so would be arrogant in the extreme. Professor Coe has spent a considerable amount of time on it, and I am spending half an hour on my response. The quality will be reflective of that commitment, so please bear in mind that these are initial musings and thoughts. What the research does, quite brilliantly, in my opinion, is to step forward as the child in the famous story and ask the emperor where his clothes are.

There are, however, some problems niggling at me. They are not in the data, but in the conclusions drawn from it, and this is where I’m focusing my attention. I wonder if there has been a tendency to infer cause and consequence from a set of correlated figures that cannot entirely support such conclusions. This is not to say I am denying the problems – the information is compelling – but the conclusions may be more complex than suggested.

Grade Inflation

For those of us who recently read the Oxford University analysis of GCSE examinations, Professor Coe’s findings may seem confusing and contradictory. In fact the two pieces of research had different objectives and points of focus. Oxford explored content and found that on the whole, the examinations were not easier and that evidence of grade inflation could just as easily be explained by improvements to teaching and learning, better access to information and general attempts to improve the quality of school resources and facilities. Professor Coe’s research does not delve into the content of the GCSE.

Instead he asks (quite rightly): if GCSE grades have improved so dramatically over the past few decades, why have there not been corresponding rises in PISA/TIMSS data and in impact on the economy? He refers to research conducted by Hanushek and Woessmann (2010), who claim that a 25-point rise on the PISA scale would equate to a £4 trillion increase in England’s GDP. I can’t even begin to imagine how that is worked out, but it sounds like a lot. Accordingly, had the rise in GCSE pass rates transferred into PISA test results, we’d all be sitting on yachts, sipping champagne. So what is wrong?
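
A rough guess at the mechanism – and it is only a guess on my part, not anything taken from the paper – is that if higher skills nudge the annual growth rate up by even a fraction of a percent, the gap between the two growth paths compounds over decades into eye-watering sums. The little sketch below uses entirely invented figures (the base GDP, growth rates, horizon and discount rate are all mine), just to show how a ‘trillions’ headline can fall out of that kind of extrapolation.

```python
# Purely illustrative sketch of a growth-extrapolation argument.
# None of these numbers come from Hanushek and Woessmann (2010);
# they are invented simply to show how the compounding works.

def extra_gdp(base_gdp, baseline_growth, boosted_growth, years, discount_rate):
    """Present value of the extra GDP produced each year on the higher growth path."""
    total = 0.0
    for t in range(1, years + 1):
        # Gap between the boosted and baseline economies in year t...
        gap = base_gdp * ((1 + boosted_growth) ** t - (1 + baseline_growth) ** t)
        # ...discounted back to today's money.
        total += gap / (1 + discount_rate) ** t
    return total

# Invented figures: a ~£1.5tn economy, 2.0% vs 2.3% annual growth, 60 years, 3% discount rate.
gain = extra_gdp(1.5e12, 0.020, 0.023, 60, 0.03)
print(f"Illustrative cumulative gain: £{gain / 1e12:.1f} trillion")
```

Whether you believe a headline figure like that then comes down to whether you believe the projected growth effect and the multi-decade horizon – which, I suspect, is rather the point.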

Well, Coe points to grade inflation. Oxford offers some doubt about this. I don’t think it matters either way, because if GCSEs are not impacting on the future success of our young people or the economy, then what is the point in having them? I wonder if, instead of GCSEs being an indicator of future success, they are in fact simply a barrier.

Let me explain. Coe points to PISA and YELLIS as evidence that success at GCSE does not match the ability identified by those measures. Let’s take YELLIS to start with. No-one preps kids for YELLIS; it is not in their interests to do so. If kids do badly on YELLIS and then well on GCSEs, the contextual value added looks better. PISA tests don’t focus on the same kinds of knowledge and understanding as GCSEs do. In fact, according to the OECD’s Andreas Schleicher, their purpose is to sort out who can and cannot apply knowledge in ‘novel’ situations. And here I think we may have the crux of the problem. GCSEs do not test the skills that might lead to such a significant impact on the economy. PISA tests may.

So…I wonder if an alternative interpretation of the data might be that while we have become much more adept at training children to leap through the hoops of GCSEs, we have not actually made them capable of the kinds of thinking or habits of mind that would make them thrive beyond school. I wonder if the GCSEs are in fact useless. Not that passing them has not been an achievement for children or their teachers, or that they haven’t involved enormous effort, but that perhaps the effort has been misplaced. I have written many times of how I sometimes want to weep when I see lessons given over to exam technique. Two minutes on this, ten minutes on that. We have become superb at getting children through GCSEs – this is not grade inflation, this is grade distraction.

Coe’s research is one of the most important documents I have read in some time. His ideas for developing CPD are interesting. He points out, rightly, that there are some issues with consistency in applying methods which are thought to work, like Assessment for Learning. This inconsistency has also been documented by Sue Swaffield at Cambridge. Both of these issues should be addressed. But I ask whether we should look at this research in a different way – as an opportunity to radically reconsider what kinds of assessments would best suit our children. What forms of application are most appropriate? What do we need in order to build a better future?

Right…half an hour is up. I welcome comments and ideas.

26 thoughts on “Research, GCSEs and Falling Standards…”

  1. I think you have raised some important points (as did Prof Coe, obviously). Whenever I see graphs and data, though, I am reminded that interpretation is everything (I’m a scientist and work with lots of different data all the time). What does the data mean? Always more difficult to answer than we feel it should be. Data in education is also especially messy; systems heavily dependent on people are always very difficult to untangle with respect to cause and effect. What does this data show? You would have to delve into it in great detail just to start to get any idea. But there is the bigger question of what use GCSEs, A Levels, university degrees, Masters, PhDs etc. really are.

    For example, Ha-Joon Chang pointed out in his recent book about capitalism that there was a poor correlation between the number of people attending University and how well the economy was doing. Counterintuitive. Surely the education system has an important impact on how well a society’s economy works, and how successful it is in the global economy? I hope so. But maybe not in the way we think it does. Education should probably be rigorous and a preparation for life, but exactly what that means is not settled, I think. We should invest heavily in education and continually seek improvement; that should be our aim, even if we aren’t sure of how to do that. Intentions are very important. So, I very much welcome the current seriousness of the debate….I think that our determination to have that debate is as important as the debate content itself…..but we will almost certainly not figure out the absolutely right answer….the system is too messy for that…..

  2. While watching people criticising Gove over GCSE reforms by commenting that ‘weighing the pig doesn’t make it fatter,’ I’ve nonetheless often wondered what would happen if the GCSE were replaced by the UKMT – surely impossible to game… would the necessity to develop relational mathematical understanding propel us forward in our pedagogy, and would those exams, for the first time, be sufficient to assess it?

    1. Would love to see responses to this question Kris. I have no idea. Is it impossible to teach to the UKMT – I mean teaching to the test? Don’t know it at all.

  3. Thanks. I doubt there will be any that I can answer. Is harder better though? What are we testing for? I wonder what your views are on the idea that functional skills with an emphasis on stats and data should be the norm with extension work for those with an interest or aptitude? I have no opinion one way or another, but just interested.

    1. It’s not *that* they’re harder, it’s *why* they’re ‘harder’ that’s important. They’re harder often because they demand real mathematical understanding. In terms of NC levels, I’ve seen problems that can be solved using nothing more than Level 6 content, and yet the nature of the problem is so unusual that I would argue it is more challenging, in terms of the complexity of thinking required, than most A* questions. But then, I suspect that ‘thinking,’ even problem solving, is a function of knowledge, and so eminently teachable; in fact, my experience of meta-thinking about what was going on in my mind when I tackled UKMT papers is part of what sold me on this idea.

      The stats thing… Arthur Benjamin, whom I *love*, makes the case for it here:

      http://www.ted.com/talks/arthur_benjamin_s_formula_for_changing_math_education.html

      Judith Grabiner, in this remarkable series of lectures:

      http://www.thegreatcourses.com/tgc/courses/course_detail.aspx?cid=1440

      has convinced me all the more just how deficient my mathematical education and understanding had been with regard to statistics, and without even trying, she makes the case for their importance.

      Similarly, Hans Rosling has done some remarkable work bringing statistics to life – fitting that it took a Swede to do so.

      But here’s the thing… if you’re looking for mathematics that people can take away with them and actually probably make use of in everyday life, then yes, stats is it, and I think there’s a strong case for it.

      However, statistical modelling is itself built upon an understanding of algebraic modelling, and therefore, of course, of number and geometry (since most advanced algebraic modelling is analytic geometry, a combination of the two). So… I wonder how far you can go with statistics without understanding its precursors. But let’s assume you can go very far – far enough; there is still the problem of what would be lacking. As well as mathematical knowledge, what we should hope people take with them from mathematics is problem modelling, logic and proof. These are not necessarily taught much at all in schools, although *technically* they are a part of the curriculum. If we want to bestow the gift of mathematical knowledge, I’m not sure that statistics is enough, nor that its focus is where we’d be best served; I feel that we should improve on our originally intended curriculum. I do, however, think we can and must vastly improve how we teach statistics, and I look forward to figuring out how, eventually.

      1. Kris – Stats can go pretty far without the algebra and geometry behind it. If someone had made me learn all that *first*, there’s no way I would have got as far as I did with stats (the only branch of maths that ever really made sense to me). However, earlier this year I had to learn the algebra behind the statistics, and doing it that way around really helped. Because I love statistics, and get it, I was actually interested in the algebra/geometry behind it. Hence it might be that having stats earlier can actually help some learners understand those other concepts better at a later date.
        (Sorry Debra for hijacking the comments on maths chat, but thought this was worth pointing out!)

  4. Debra – on Hanushek & the 4 trillion… you’re right to be suspicious of this work… Hanushek is an economist and hence writes papers which are tight mathematically but which extrapolate without due regard for the consequences of policies in the “real” (i.e. non-rational) world. The idea that by educating people better you would suddenly get an extra 4 trillion dollars into the country is, clearly, a leap of maths-faith.
    As for the point that you are making on Coe, I thought the same thing. We have made students better at GCSEs, but this has not made them better at the skills that underlie PISA. We now have to ask if this matters: do we trust the PISA test setters to be uncovering the sorts of skills/knowledge we want students to have more than we trust the GCSE setters? Or, as Kris suggests, are there alternative tests which might do an even better job of testing the sorts of skills/knowledge we wish young people to have?

    1. I thought as much – thank you for clarifying. I wonder if there is also a ‘horses for courses’ approach: whether for some subjects testing is most appropriate, but for others different modes of assessment are more reliable – for extended writing, analysis or research skills, perhaps? Whatever the answer, it is clear that a debate needs to be taking place. Am I right in thinking that the very question of whether to test at all is being explored in Finland at the moment?

  5. I’m very glad that my academic career from 16 onwards wasn’t dependent on passing this test. Without a Maths qualification, what jobs can you do? If this was to become the test, I’d hope that a Maths qualification would cease to be a prerequisite to studying, say, English and Russian literature at University! Otherwise, I like the point. It’s food for thought!

  6. Thanks for this Debra, and it’s been very interesting to read the comments as well.

    I was always a ‘hopeless’ mathematician at school. It’s not my subject, I have a bit of a blank with it and struggle to retain anything involving numbers. I had some teachers who thought that maths was for boys not girls. They convinced me. I managed a B grade at O Level but have not really used it since.

    Until … I started to help run my local preschool, and suddenly I was having to deal with cashflow, accounts, spreadsheets, funding calculations, staffing costs, employer’s NI calculations, and all kinds of mathematical operations. And funnily enough I suddenly found that (with some support) I could actually do it. So, although it took me until my early forties, and although I will never be a genius at the subject, when I needed to use my basic maths knowledge I could.

    I’m not sure that this adds anything at all to your discussion, beyond the thought that sometimes people just aren’t ready for a subject at GCSE age, or that they are only ready for it if they are shown its relevance in their lives. When you’re a kid, it’s often that there’s no IMPERATIVE to learn this stuff, beyond an adult droning on at you to do it.

    I am with you on the notion of a portfolio of learning to show what a child can do, particularly in those subjects that use extended writing, or extended exercises of any kind (for instance, art). Why not celebrate success rather than testing to see who ‘can’t cut it’?

    1. Thank you Sue – I was like you too – and struggled to get my O Level grade B – though I seem to remember liking something called Calculus and another thing called Differentiation. I can’t remember what either were now! My adult maths use has been about balancing budgets, managing data spreadsheets and trying to figure out how reliable statistical evidence is in research. Oh, and ordering carpet, converting metric and imperial measurements and working out how few rolls of wallpaper I can get away with buying! I wonder if we need to think about pathways for children who will take functional courses and ones who genuinely want to explore what Kris calls ‘proper’ maths. Sort of a core ‘functional certificate’ that all children should aim to achieve and then further study in a range of subjects as a choice. I know…. a minefield….

        1. Having taught functional maths, I actually found that many students find it even harder than ordinary maths. Both you and Sue were actually in your contexts when you needed to use maths in that way, so it made sense to you – the problem with functional maths is that it often requires ‘grown-up knowledge’ about the way the world works (e.g. knowing what wallpaper is, which can be very uncommon in certain parts of the country).

        1. Hmmm true – so context building is key. That’s why I love Mantle of the Expert as a pedagogy though because that’s exactly what you do. You ask the question ‘Who in the world needs to know this stuff?’ and then you set up an in-role enterprise to become that group of people working on a problem.

        2. I understand the point about context, and context setting to answer the question ‘why bother with this?’ But I think it is a red herring. Possibly we can create the illusion of context (which is what MoE does), and sometimes just introducing the veneer of a ‘game’ is enough… but, really, schools should be places where children want to tackle everything that is set before them because they deeply know that this stuff is good; an inherent respect for education (is this partly a societal failure that signals to kids that perhaps the education system is a waste of time…)? Don’t know the answers….

      2. Laura – I’m open to that. There’s also a tendency for people to advocate domain topics based on their personal preference or experience. I think here we need to be very careful – your experience of other parts of mathematics may have been very poor, hence you not liking them. Even though your personal experience of stats worked out well, it doesn’t follow that the order in which you learnt these things is optimal. There’s not necessarily anything in my mind to suggest the converse yet either, though – that’s an analysis to be performed.

    2. Sue, it feels like there’s something going on here that I see often in adults who had a poor experience of mathematical education; it worries me quite a bit.

      “I’m not sure that this adds anything at all to your discussion, beyond the thought that sometimes people just aren’t ready for a subject at GCSE age, or that they are only ready for it if they are shown its relevance in their lives. When you’re a kid, it’s often that there’s no IMPERATIVE to learn this stuff, beyond an adult droning on at you to do it.

      I am with you on the notion of a portfolio of learning to show what a child can do, particularly in those subjects that use extended writing, or extended exercises of any kind (for instance, art). Why not celebrate success rather than testing to see who ‘can’t cut it’?”

      In your first paragraph, what you describe just sounds to me like the result of bad teaching! It reminds me of this fairly well known cartoon:

      http://remixingcollegeenglish.files.wordpress.com/2012/07/testing_cartoon.jpg

      Most people interpret that cartoon to mean that our assessment methods are unfair – clearly the monkey will succeed, and everyone else will fail. As you echo, the belief is that our method of assessment encourages and highlights failure, rather than highlighting people’s achievements and successes.

      Ironically, I’ve derived a very different meaning from it. First, if what you want to assess is ‘tree climbing ability,’ then this would be an exceptional method of assessment! So why is it unfair? Well, poor goldfish, clearly he’s never going to make it to the top! The monkey is destined to win; her nature is perfectly adapted to climbing. If we are to accept the analogy, then our assessment methods are unfair because some people are monkeys, and some people are fish, and the fish are never going to win. In other words, ‘some people just can’t do maths.’ To me, that sounds worryingly like Gerson’s killer line ‘…the soft bigotry of low expectations.’ I don’t see any child as a fish who will never learn maths. I don’t think an A* is a particularly high achievement after 11 years of mathematical education, and by that I don’t mean ‘it’s easy for me’; I mean I think it’s eminently achievable for almost everyone, with the caveat that they be taught exceptionally well for all 11 years, in a calm, safe, nurturing, well disciplined school environment.

      The solution is not to give up and assume that some kids will just never be cut out for it by GCSE age; it’s to improve our national system of mathematical education until *everyone* leaves aged 16 with an A*. Fine, by then we’ll invent a new higher standard to sort everyone into their respective bell-curve categories, but A* as it currently stands would leave everyone with a good level of mathematical knowledge and ability.

      People who struggle with the maths exam are not fish. They are monkeys who never learnt to climb.

      1. “People who struggle with the maths exam are not fish. They are monkeys who never learnt to climb.” < Couldn't agree more! Profoundly and correctly put.

  7. This seems to go to the heart of what “education” is for: learning how to learn, using and applying what has been learned in real contexts, or learning just enough to “pass” the exam, with much prep to do so.

    We need an education system that values both elements and finds a way to examine effectively. As you have said, some subjects are easier to quantify than others.

    I would see an argument for training all teachers to examiner level, so that they can judge their own and other teachers’ outcomes. Examine maths, aspects of science and specific “knowledge” elements of subjects, then moderate teacher-assessed English, art and so on?

    1. Ooh, interesting thought that, Chris. Of course in many European countries the teachers ARE the examiners – they are trusted to mark their own students’ work. But to train them all to that level and then cross-moderate might be a really interesting idea. There used to be an old GCSE course like that called the Leicestershire Mode 3 course – moderated and marked through consortia of teachers.

      1. I have had this thought for some time. Training the whole workforce to examiner level, over time, could really improve education, as thinking and diagnosis deepen.
        It would cost at the start, but might save significantly over time.
        Teacher examiners would also speed up the systems.

  8. Hi – really liked your blog and the way you have written it. I am not a statistician, but I was concerned by his methodology. Viz:
    Take two bits of data which use completely different methodologies and collect completely different evidence.
    Lay one on top of the other.
    Say one must be false.
    (You get to pick the one you say is false)
    In his paper he did this twice.
    GCSEs are a highly complex assessment with a massive accountability structure. The other data sets he uses are less complex and less consistent over time.
    The essential question – are these data sets all describing the same thing? – is not examined.
    What annoyed me, though, was the facetious and sarcastic tone of his address, particularly Fig. 6. It contained inherently spurious arguments, e.g. ‘Do nothing and things will probably get better’ and ‘Don’t address the real problems and they will disappear.’ Despite these propositions, his fundamental argument was that things hadn’t got better.

    1. Thank you. Yes, and I think your comments unearth for me a deep-seated unease with the calls for ‘evidence-based research’ in education. Whose evidence? Used for what ends? Interpreted in which ways? No data is innocent or unloaded.

  9. Falling standards? If we want this country to do better in the international comparisons then we’d better learn quickly from what the very best systems are now doing – and take note of what they no longer do. As we’ve been pointing out for some time, Singapore long ago began to introduce the Finnish ‘model’, and now the same is happening in China, following the successful ‘pilot’ in Shanghai Province these past five years.

    The CBI has said it wants school leavers who are proficient in literacy and spoken English, and in ‘everyday’ arithmetic and numeracy. To this end the CBI has said it wants to see 16+ examinations completely abolished, in order that these key skills plus personal and social skills can be better developed. The CBI knows very well that there is no equivalence between good performance in exams and young people’s ability to apply knowledge in the work place. The CBI’s members are quite content to provide specialist training for essential workplace skills – what they need are young people who possess high levels of every kind of intelligence, plus creative aptitudes and an enthusiasm for lifelong learning. Consider how many of us have passed GCSE French and yet can’t hold a conversation with a French speaker. It’s a similar thing. We remember well learning trigonometry and quadratic equations, and never finding a single use for them in real life. (http://3diassociates.wordpress.com/2012/11/29/international-comparisons-enlightened-education/)

    The rest of the world (with the exception of the USA and South Korea) is shifting to a new 21st Century paradigm of teaching and learning. Clearly Mr Gove and his various cheerleaders intend to cling on to the 19th Century approaches they know and love. Meanwhile our children and young people are not only losing out – their wellbeing is increasingly at risk in a system where ‘standards’ are continuously ‘driven up’ to meet the demands of politicians and anxious parents who see high exam grades and places at prestigious universities as the be-all and end-all of education.
    GF
