Nicky Rushton

I have worked on a wide range of projects within the Research Division at Cambridge University Press and Assessment. Two of these have been particularly memorable. I was responsible for running the maths research during the GCSE redevelopment work, which involved answering many research questions within a very short space of time. Another memorable project was the Aspects of Writing project, as part of which I completed an analysis of candidates’ spelling errors in scripts from 2004, 2007 and 2014.

The majority of my recent research has focused on the comparability of examinations, particularly comparisons of international education systems. I maintain a database of comparability-related research reports, and track the changes made to general qualifications (GCSEs and A Levels) as well as to other aspects of the educational system in England. Recently, I have also been involved in a project that investigated how examiners write question papers.

My first degree was a BEd from the University of Cambridge, and I spent two years working as a teacher in a first school in Newmarket. In 2003, I joined Cambridge University Press and Assessment, initially writing maths questions for an online mathematics product and then working in the team that produces the university admissions tests. Since joining the Research Division in 2009, I have completed my MEd at the Open University.

In my spare time, I like to spend as much time outdoors as possible. At weekends, I can usually be found walking, cycling or sailing. I also sing in the university choir.

Publications

2024

How do approaches to curriculum mapping affect comparability claims? An analysis of mathematics curriculum content across two educational jurisdictions

Rushton, N., Majewska, D., & Shaw, S. (2024). How do approaches to curriculum mapping affect comparability claims? An analysis of mathematics curriculum content across two educational jurisdictions. Research Matters: A Cambridge University Press & Assessment publication, 37, 40-56. https://doi.org/10.17863/CAM.106032

Curriculum mapping is a comparability method that facilitates comparisons of content across multiple settings (usually multiple jurisdictions or specifications) and enables claims to be made about those curricula/jurisdictions. Although curriculum maps have been published, there is little academic literature about the process of constructing and using them. Our study extends the literature by considering the different types of comparisons that can be made from curriculum maps: content coverage, placement, depth, and breadth. We also consider how these comparisons are affected by structural differences in the curricula or by the use of a subset of the content.

We use our mapping of mathematics in the US Common Core State Standards (CCSS) and the national curriculum in England to explore this. The CCSS for mathematical practice are common to all grades; we mapped these standards against the content for individual years in the national curriculum. The CCSS for mathematical content are set out by grade; we mapped a subset of this content to the national curriculum.

Our mapping shows that it is possible to map curricula and make meaningful comparisons despite structural differences and content limitations. However, these differences and limitations affected the types of comparisons that we could carry out and the claims that we could make.

Research Matters 37: Spring 2024
  • Foreword Tim Oates
  • Editorial Tom Bramley
  • Extended Reality (XR) in mathematics assessment: A pedagogical vision Xinyue Li
  • Does ChatGPT make the grade? Jude Brady, Martina Kuvalja, Alison Rodrigues, Sarah Hughes
  • How do approaches to curriculum mapping affect comparability claims? An analysis of mathematics curriculum content across two educational jurisdictions Nicky Rushton, Dominika Majewska, Stuart Shaw
  • Exploring speededness in pre-reform GCSEs (2009 to 2016) Emma Walland
  • A Short History of the Centre for Evaluation and Monitoring (CEM) Chris Jellis
  • Research News Lisa Bowett

2023

COVID-19-related changes to upper secondary assessments in six countries: Adaptations and reactions

Rushton, N., & Lestari, S. (2023, November 1–4). COVID-19-related changes to upper secondary assessments in six countries: Adaptations and reactions [Paper presentation]. Annual conference of the Association for Educational Assessment – Europe (AEA-Europe), Malta. https://2023.aea-europe.net/

2022

How did we get here? Timelines showing changes to maths education in England and the United States
Majewska, D., Rushton, N., & Shaw, S. (2022). How did we get here? Timelines showing changes to maths education in England and the United States. Cambridge University Press & Assessment Research Report. Cambridge, UK: Cambridge University Press & Assessment.
Tracing the trajectory of mathematics teaching across two contrasting educational jurisdictions: A comparison of historical and contemporary influences
Shaw, S. D., Rushton, N., & Majewska, D. (2022). Tracing the trajectory of mathematics teaching across two contrasting educational jurisdictions: A comparison of historical and contemporary influences. The International Education Journal: Comparative Perspectives, 21(1), 41-60.
Register of Change Part 2: 2010-2021
Rushton, N., & Ireland, J. (2022). Register of Change Part 2: 2010-2021. Cambridge University Press & Assessment Research Report. Cambridge, UK: Cambridge University Press & Assessment.
Register of Change Part 1: 2000-2010
Rushton, N. (2022). Register of Change Part 1: 2000-2010. Cambridge University Press & Assessment Research Report. Cambridge, UK: Cambridge University Press & Assessment.
COVID-19 related consultations 2020-2021
Rushton, N. (2022). COVID-19 related consultations 2020-2021: A Register of Change supplementary document. Cambridge University Press & Assessment Research Report. Cambridge, UK: Cambridge University Press & Assessment.

2021

What do we mean by question paper error? An analysis of criteria and working definitions

Rushton, N., Vitello, S., & Suto, I. (2021). What do we mean by question paper error? An analysis of criteria and working definitions. Research Matters: A Cambridge University Press & Assessment publication, 32, 67-81.

It is important to define what counts as a question paper error, both to establish a common understanding and to prevent people’s own conceptions from affecting the way in which they write or check question papers. We carried out an interview study to investigate our colleagues’ definitions of error. We found that there is no single accepted definition of a question paper error. Participants considered three interacting aspects when deciding whether a problem was an error: how the error manifested itself, its (potential) impact upon candidates, and the stage at which it was discovered.

Research Matters 32: Autumn 2021
  • Foreword Tim Oates
  • Editorial Tom Bramley
  • Learning during lockdown: How socially interactive were secondary school students in England? Joanna Williamson, Irenka Suto, John Little, Chris Jellis, Matthew Carroll
  • How well do we understand wellbeing? Teachers’ experiences in an extraordinary educational era Chris Jellis, Joanna Williamson, Irenka Suto
  • What do we mean by question paper error? An analysis of criteria and working definitions Nicky Rushton, Sylvia Vitello, Irenka Suto
  • Item response theory, computer adaptive testing and the risk of self-deception Tom Benton
  • Research News Anouk Peigne

2019

Towards a method for comparing curricula
Greatorex, J., Rushton, N., Coleman, T., Darlington, E. and Elliott, G. (2019). Towards a method for comparing curricula. Cambridge Assessment Research Report. Cambridge, UK: Cambridge Assessment.
A culture of question writing: Professional examination question writers’ practices
Johnson, M. and Rushton, N. (2019). A culture of question writing: Professional examination question writers’ practices. Educational Research, 61(2), 197-213.

2018

What lessons from current working practice can be applied to big data? Identifying GCSE equivalents across many jurisdictions
Rushton, N. (2018). What lessons from current working practice can be applied to big data? Identifying GCSE equivalents across many jurisdictions. Presented at the 44th conference of the International Association for Educational Assessment, Oxford, UK, 9-14 September 2018.

2017

Is the General Certificate of Secondary Education (GCSE) in England incongruous in the light of other jurisdictions’ approaches to assessment?
Elliott, G., Rushton, N. and Ireland, J. (2017). Presented at the 18th annual AEA Europe conference, Prague, 9-11 November 2017.
A culture of question writing: How do question writers compose examination questions in an examination paper?
Johnson, M. and Rushton, N. (2017). Presented at the 18th annual AEA Europe conference, Prague, 9-11 November 2017.
Popular perceptions about the comparability of assessments in England. A tension between academia and the mainstream broadcast and print media?
Elliott, G. and Rushton, N. (2017). Presented at the 18th annual AEA Europe conference, Prague, 9-11 November 2017.
Spelling errors in 16-year-olds’ writing
Rushton, N. (2017). Presented at the annual conference of the British Educational Research Association, University of Sussex, Brighton, UK, 5-7 September 2017.
Developing a framework for coding English students’ spelling errors
Rushton, N. (2017). Presented at the annual European Conference on Educational Research, University College Copenhagen, Denmark, 22-25 August 2017.

2016

Research Matters Special Issue 4: Aspects of Writing 1980-2014
  • Variations in aspects of writing in 16+ English examinations between 1980 and 2014 Gill Elliott, Sylvia Green, Filio Constantinou, Sylvia Vitello, Lucy Chambers, Nicky Rushton, Jo Ireland, Jessica Bowyer, David Beauchamp

2015

Do experts’ views of specification demands correspond with established educational taxonomies?
Greatorex, J., Rushton, N., Mehta, S. and Grayson, R. (2015). Do experts’ views of specification demands correspond with established educational taxonomies? Online Educational Research Journal. (Advance online publication).
Teachers’ and employers’ views on the transition from GCSE Mathematics to A level Mathematics or employment

Rushton, N. & Wilson, F. (2015). Teachers’ and employers’ views on the transition from GCSE Mathematics to A level Mathematics or employment. Research Matters: A Cambridge Assessment publication, 20, 21-27.

Mathematics is one of the core GCSE subjects, and students are required to study the subject until the end of Key Stage 4 (KS4), when they are approximately aged 16. There is no requirement for students to take a qualification in Mathematics, but almost all students do. GCSE Mathematics is important because it represents the end of students' compulsory Mathematics learning. The current study aimed to identify the areas of Mathematics that were problematic for students who had just completed GCSE Mathematics. It also aimed to discover whether there was any overlap in the skills that were considered to be problematic as preparation for A level and those considered to be problematic as preparation for employment. It uses responses from a larger survey of teachers and employers to consider three research questions: 1. What areas of Mathematics are GCSE students well/poorly prepared in? 2. What teaching is needed to bring students up to the standard for starting A level Mathematics? 3. What Mathematics training do employers run for school leavers?

Assessment, aim and actuality: insights from teachers in England about the validity of a new language assessment model
Johnson, M., Mehta, S. and Rushton, N. (2015). Assessment, aim and actuality: insights from teachers in England about the validity of a new language assessment model. Pedagogies: An International Journal, 10(2), 128-148.
Are claims that the GCSE is a white elephant red herrings?
Elliott, G., Rushton, N., Darlington, E., and Child, S. (2015). Are claims that the GCSE is a white elephant red herrings? Cambridge Assessment Research Report. Cambridge, UK: Cambridge Assessment.

2014

Teachers’ and employers’ views on the transition from GCSE mathematics to A-level mathematics or employment
Rushton, N. and Wilson, F. (2014). Teachers’ and employers’ views on the transition from GCSE mathematics to A-level mathematics or employment. Paper presented at the British Educational Research Association (BERA) conference, London, 23-25 September 2014.
Course struggle, exam stress, or a fear of the unknown? A study of A level students’ assessment preferences and the reasons behind them
Suto, I., Elliott, G., Rushton, N. and Mehta, S. (2014). Course struggle, exam stress, or a fear of the unknown? A study of A level students’ assessment preferences and the reasons behind them. Educational Futures (ejournal of the British Educational Studies Association), 6(2).
Common errors in Mathematics

Rushton, N. (2014). Common errors in Mathematics. Research Matters: A Cambridge Assessment publication, 17, 8-17.

When answering Mathematics questions, students often make errors leading to incorrect answers or the loss of accuracy marks. Many of these errors will be random, occurring through slips in calculation or misreading of the question, and will not affect many candidates. However, some errors may be seen in a number of students’ scripts. These are sometimes referred to as common errors.

The aim of this study was to identify common errors that have been made in Mathematics exams. Three Mathematics specifications were used in this study: IGCSE Mathematics (0580), GCSE Mathematics A (J512) and GCSE Mathematics B (J567). Copies of the examiners’ reports and exam papers were obtained for all three qualifications for June 2009, 2010, 2011 and 2012. Within each examiner’s report, any common errors that candidates made were coded against a theme and sub-theme. The results were intended to inform the redevelopment of the mathematics qualifications, and to provide useful information for teachers and examiners. 

2013

Changing times, changing qualifications

Rushton, N. (2013). Changing times, changing qualifications. Research Matters: A Cambridge Assessment publication, 16, 2-9.

During recent years there have been many changes to education and assessment in England. Since 2000, curricula have been updated, particular skills have been added to and then removed from assessment, several new qualifications for students in English secondary schools have been added to the Register of Regulated Qualifications, and other qualifications have been withdrawn. When so many changes occur in a short space of time it is difficult to keep track of them, and of when they happened.

This article tracks some of the changes that have occurred in England since 2000. The article is divided into three sections: qualifications being added to and withdrawn from the Register of Regulated Qualifications; changes to GCSEs (including the proposed English Baccalaureate Certificates); and changes to A levels. For each section, a timeline is included to provide an overview of the most important dates, alongside a summary of the major events associated with each qualification/reform.

Independent research at A level: Students’ and teachers’ experiences

Mehta, S., Suto, I., Elliott, G. and Rushton, N. (2013). Independent research at A level: Students’ and teachers’ experiences. Research Matters: A Cambridge Assessment publication, 15, 9-16. 

Our aims were to explore teachers’ and students’ experiences and perspectives of independent research at A level. The study focused on Economics, French and Mathematics. It investigated: (i) the extent to which teachers think research and investigative skills can be developed at A level; (ii) the resources and guidance that students use; and (iii) whether subject-specific differences arise. A questionnaire and follow-on interview methodology was used. Forty-seven Mathematics teachers, 24 Economics teachers and 15 French teachers participated. Additionally, 299 Mathematics students, 228 Economics students and 136 French students took part.

About half of the French and Economics teachers were found to assign investigative/research tasks to their students at least once a fortnight. On the other hand, about half of the Mathematics teachers set such tasks less often and a further 40% never set them at all. The frequency with which the teachers set investigation/research tasks as homework/private study showed the same subject-specific differences as the classroom context. The internet was the most frequently listed source that students across all three subjects consulted while engaging in independent research. The interview data shed further light on general and specific internet usage. Overall, the findings explain some of the variation in preparedness of new undergraduates for independent study and research-related tasks at university.

Why study Economics? Perspectives from 16 to 19 year old students
Mehta, S., Suto, I., Elliott, G. and Rushton, N. (2013). Paper presented at the International Association for Citizenship, Social and Economics Education annual conference, Bath Spa University, June 2013.

2012

The effect of scripts’ profiles upon comparability judgements

Rushton, N. (2012). The effect of scripts’ profiles upon comparability judgements. Research Matters: A Cambridge Assessment publication, 14, 10-17.

Comparability studies often involve experts judging students’ scripts to decide which is better. Sometimes this involves experts judging more than one paper/component from a student. Students frequently achieve a higher level in one paper/component than they do on another. The scripts from these students are described as having an uneven profile.

Uneven profile scripts have been identified as a cause of difficulty for those making judgements in comparability studies. This study investigated whether it was harder to compare uneven profile scripts and whether such scripts were judged more harshly.

It found that the profile of the script affected the difficulty of making comparisons only in English Literature, where the effect varied by judge. Uneven profile scripts were slightly more likely to win their comparisons, but this depended on the judge. These findings suggest that the outcome of comparability studies could be affected by the presence of uneven profile scripts.

2011

Going beyond the syllabus: A study of A level Mathematics teachers and students
Suto, I., Elliott, G., Rushton, N. and Mehta, S. (2011). Going beyond the syllabus: A study of A level Mathematics teachers and students. Educational Studies.
The pitfalls and positives of pop comparability

Rushton, N., Haigh, M., and Elliott, G. (2011). The pitfalls and positives of pop comparability. Research Matters: A Cambridge Assessment publication, Special Issue 2, 52-56. 

The media debate about standards in public examinations has become an August ritual. The debate tends to be polarised, with reports of ‘slipping standards’ at odds with those claiming that educational prowess has increased. Some organisations have taken matters into their own hands and carried out their own studies investigating this. Some of these are similar to academic papers; others are closer in nature to a media campaign. In the same way as ‘pop psychology’ is a term used to describe psychological concepts which attain popularity amongst the wider public, so ‘pop comparability’ can be used to describe the evolution of a layperson’s view of comparability. Studies, articles or programmes which influence this wider view fall into this category and are often accessed by a much larger audience than academic papers. In this article, five of these studies are considered: Series 1 of the televised social experiment “That’ll Teach ‘em”; the Royal Society of Chemistry’s Five-Decade Challenge; the Guardian’s and the Times’ journalists (re)sitting examinations to experience their difficulty; a feature by the BBC Radio 4 programme ‘Today’ (2009), in which students discussed exam papers from 1936; and a book of O level past papers and an associated newspaper article which described students’ experiences of sitting the O level exams.

What form of feedback most motivates students? A study of teachers' perceptions of the impact of assessment
Rushton, N., Suto, I., Elliott, G. and Mehta, S. (2011). Paper presented at the AEA-Europe annual conference, Belfast, November 2011.
Comparing specifications in a diverse qualifications system: instrument development
Greatorex, J., Rushton, N., Mehta, S. and Hopkin, R. (2011). Paper presented at the British Educational Research Association annual conference, University of London Institute of Education, September 2011.
Independent research at A level: students' and teachers' experiences
Mehta, S., Suto, I., Elliott, G. and Rushton, N. (2011). Paper presented at the British Educational Research Association annual conference, University of London Institute of Education, September 2011.
Going beyond the syllabus: views from teachers and students of A level mathematics
Suto, I., Elliott, G., Rushton, N. and Mehta, S. (2011). Paper presented at the British Educational Research Association annual conference, University of London Institute of Education, September 2011.
Small is beautiful? An exploration of class size at A level
Rushton, N., Suto, I., Elliott, G. and Mehta, S. (2011). Paper presented at the British Educational Research Association annual conference, University of London Institute of Education, September 2011.
Comparing specifications from diverse qualifications: instrument development
Greatorex, J., Rushton, N., Mehta, S. and Hopkin, R. (2011). Paper presented at the Journal of Vocational Education and Training International conference, Oxford, July 2011.
Developing a research tool for comparing qualifications

Greatorex, J., Mehta, S., Rushton, N., Hopkin, R. and Shiell, H. (2011). Developing a research tool for comparing qualifications. Research Matters: A Cambridge Assessment publication, 12, 33-42.

Comparability studies about qualification standards generally use demand or candidates’ performance as comparators. However, these can be unrepresentative for vocational and new qualifications. Consequently, other comparators need to be used. This article details the process of devising and piloting a research instrument to compare the features of cognate units from diverse qualifications and subjects.

First, knowledge was elicited from twelve experts through Kelly’s repertory grid interviews, in which they were asked to compare different types of qualifications. This data was analysed thematically. Four features and several sub-features were identified. These features were used to categorise the interview data and develop the research instrument. A pilot of the instrument indicated that salient features varied between units. Therefore, the instrument is suitable for use in future comparability studies about features. However, conventions still need to be agreed for how to analyse the data that is collected using the instrument.

The accuracy of forecast grades for OCR A levels
Gill, T. and Rushton, N. (2011). The accuracy of forecast grades for OCR A levels. Statistics Report Series No. 26.
Why study economics? Perspectives from 16 to 19 year old students
Mehta, S., Suto, I., Elliott, G. and Rushton, N. (2011). Why study economics? Perspectives from 16 to 19 year old students. Citizenship, Social and Economics Education.

2010

Is CRAS a suitable tool for comparing specification demands from vocational qualifications?

Greatorex, J. and Rushton, N. (2010). Is CRAS a suitable tool for comparing specification demands from vocational qualifications? Research Matters: A Cambridge Assessment publication, 10, 40-44.

The aim of the research was to ascertain whether a framework of cognitive demands, known as CRAS, is a suitable tool for comparing the demands of vocational qualifications. CRAS was developed for use with academic examinations and may not tap into the variety of demands which vocational qualifications place on candidates. Data were taken from a series of comparability studies by awarding bodies and the national regulator. The data were the frameworks (often questionnaires) used to compare qualifications in these studies. All frameworks were mapped to CRAS. It was found that most aspects of the various frameworks mapped to an aspect of CRAS. However, there were demands which did not map to CRAS; these were mostly affective and interpersonal demands, such as working in a team. Affective and interpersonal domains are significant in vocational qualifications; therefore, using only CRAS to compare vocational qualifications is likely to omit key demands from the comparison.

Research Matters

Research Matters is our free biannual publication, which allows us to share our assessment research, across a range of fields, with the wider assessment community.