-
Research Matters 39 - Foreword
Oates, T. (2025). Foreword. Research Matters: A Cambridge University Press & Assessment publication, 39, 4.
After the shock of the "Pandemic Years", the educational news has shifted to grand movements in curriculum and assessment: system-level reviews in a substantial number of nations, and assessment innovations driven by digital technology and artificial intelligence (AI). It does feel as though tectonic plates are shifting. But innovation needs to continue at a micro as well as a macro level, driven by a commitment to the accumulation of scientific knowledge about learning and measurement.
-
Research Matters 39 - Editorial
Crisp, V. (2025). Editorial. Research Matters: A Cambridge University Press & Assessment publication, 39, 5.
Our first article explores whether taking Core Maths qualifications (at age 16 to 18 years) may have benefits for students during higher education. Our second article relates to using comparative judgement (CJ) to support decisions about setting grade boundaries. Our third article considers how certain features of exam question design could plausibly have implications for the accessibility of questions and explores the use of omit rates as a way to monitor for accessibility issues. Our fourth article describes research in which images of annotations were extracted from large samples of GCSE Combined Science and GCSE Mathematics exam scripts in order to derive frequencies of learner annotations, types of annotations, and annotation heat maps (that provide a visualisation of the frequency of annotations in different locations on or around a question). Our final article describes research in which economics learners took a digital multiple-choice exam with access to either scrap paper or a print of the test. The findings from the latter two studies have potential implications for functionality within digital testing platforms.
-
The impact of taking Core Maths on students’ higher education outcomes
Gill, T. (2025). The impact of taking Core Maths on students’ higher education outcomes. Research Matters: A Cambridge University Press & Assessment publication, 39, 6-25. https://doi.org/10.17863/CAM.116167
One of the main aims of Core Maths qualifications when they were introduced into the post-16 curriculum in 2014 was to help students develop their understanding of maths and its application to different subject areas, particularly in relation to further study (e.g., higher education). In this article, we explore whether Core Maths is fulfilling this aim. In particular, we answer the following questions:
• Are Core Maths students less likely than non-Core Maths students to drop out of higher education (HE) courses with a quantitative element?
• Is taking Core Maths associated with better degree performance in courses with a quantitative element?
We investigated these questions using logistic regression analysis. We found that Core Maths students had a slightly lower probability than non-Core Maths students of dropping out of HE in their first year, even after accounting for other factors likely to affect drop-out rates, such as prior attainment. The other main finding was that Core Maths students were slightly more likely to achieve a good degree classification.
These results suggest that taking Core Maths may benefit students taking a quantitative subject at HE, perhaps by giving them the skills they need to apply mathematical knowledge to their subject.
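The article reports logistic regression analyses of drop-out and degree outcomes. As a minimal sketch of the general shape of such a model, the snippet below regresses a binary drop-out indicator on a Core Maths flag and prior attainment; the file and variable names (students.csv, dropped_out, core_maths, prior_attainment) are illustrative assumptions, and the study's actual models control for many more factors.

```python
# Minimal sketch of a logistic regression for first-year drop-out.
# All names below are hypothetical; this is not the study's dataset or code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# One row per student on a quantitative HE course.
df = pd.read_csv("students.csv")

# Model the probability of dropping out as a function of taking Core Maths,
# controlling for prior attainment (further controls would be added similarly).
model = smf.logit("dropped_out ~ core_maths + prior_attainment", data=df).fit()
print(model.summary())

# Odds ratios are easier to interpret than raw log-odds coefficients.
print(np.exp(model.params))
```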
-
Is one comparative judgement exercise for one exam paper sufficient to set qualification-level grade boundaries?
Benton, T. (2025). Is one comparative judgement exercise for one exam paper sufficient to set qualification-level grade boundaries? Research Matters: A Cambridge University Press & Assessment publication, 39, 26-38. https://doi.org/10.17863/CAM.116168
This research draws on evidence from three qualifications taken in autumn 2020, when comparative judgement (CJ) was used as a key source of data in setting grade boundaries. In these cases, a separate CJ exercise was completed for each individual paper in the qualification so that standards could be maintained from a previous series. In this article, we explore what would have happened had we relied on a single CJ exercise on one paper to maintain standards in the whole qualification. We first examine whether evidence from different papers provides a consistent picture of changes in cohort ability between series. We then explore the impact of relying on evidence from one paper only on the precision with which we can identify appropriate qualification-level grade boundaries using CJ.
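Paired judgements from a CJ exercise are commonly analysed by fitting a Bradley-Terry model, which places scripts on a shared latent scale from which boundary marks can be read. The article does not spell out its estimation procedure, so the snippet below is only a toy illustration of that standard approach, with invented data.

```python
# Toy Bradley-Terry fit for comparative judgement data (illustrative only).
# Each judgement records (winner_index, loser_index) for a pair of scripts.
import numpy as np
from scipy.optimize import minimize

judgements = [(0, 1), (0, 2), (1, 2), (0, 1), (1, 0), (2, 1)]  # invented data
n_scripts = 3

def neg_log_likelihood(theta):
    # Bradley-Terry: P(i beats j) = exp(theta_i) / (exp(theta_i) + exp(theta_j))
    nll = 0.0
    for winner, loser in judgements:
        nll -= theta[winner] - np.logaddexp(theta[winner], theta[loser])
    return nll

result = minimize(neg_log_likelihood, np.zeros(n_scripts))
theta = result.x - result.x.mean()  # centre the scale, which is otherwise arbitrary
print(theta)  # estimated quality of each script on a common logit scale
```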
-
Accessibility of GCSE science questions that ask students to create and augment visuals
Lestari, S. (2025). Accessibility of GCSE science questions that ask students to create and augment visuals: Evidence from question omit rates. Research Matters: A Cambridge University Press & Assessment publication, 39, 39-65. https://doi.org/10.17863/CAM.116169
The ability to draw visual representations such as diagrams and graphs is considered fundamental to science learning. Science exams therefore often include questions which require students to draw a visual representation, or to augment a partially provided one. The design features of such questions (e.g., layout of diagrams, amount of answer space) could, however, influence students' ability to respond to the questions and present potential accessibility issues, which in turn could influence the validity of score inferences. This article reports on a small-scale study examining the accessibility of GCSE science questions involving the creation and augmentation of visuals (e.g., adding an element to a partially provided diagram) by analysing the patterns of question omit rates. Omit rates for questions involving creating or augmenting visuals were compared to those for questions without, and these comparisons were conducted across tiers, subjects, question position, maximum marks and facility values, as well as by gender and attainment group.
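As a rough illustration of how such omit-rate comparisons can be computed, the pandas sketch below takes a long table of candidate-by-question responses and summarises the proportion of blanks by question type and tier. All file and column names (item_responses.csv, omitted, involves_visual, tier) are assumptions for the example, not the study's data.

```python
# Illustrative omit-rate computation; all names are hypothetical.
import pandas as pd

# One row per candidate-question pair, with a boolean "omitted" flag.
responses = pd.read_csv("item_responses.csv")

# Omit rate per question: the proportion of candidates leaving it blank.
omit_rates = (
    responses.groupby(["question_id", "involves_visual"])["omitted"]
    .mean()
    .rename("omit_rate")
    .reset_index()
)
print(omit_rates.head())

# Compare visual vs non-visual questions within each tier.
summary = (
    responses.groupby(["tier", "involves_visual"])["omitted"]
    .mean()
    .unstack("involves_visual")
)
print(summary)
```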
-
How do candidates annotate items in paper-based maths and science exams?
Williamson, J. (2025). How do candidates annotate items in paper-based maths and science exams? Research Matters: A Cambridge University Press & Assessment publication, 39, 66-89. https://doi.org/10.17863/CAM.116170
Teachers, examiners and assessment experts know from experience that some candidates annotate exam questions. “Annotation” includes anything the candidate writes or draws outside of the designated response space, such as underlining, jotting, circling, sketching and calculating. Annotations are of interest because they may evidence aspects of candidates’ response activity that would be overlooked when focusing on response spaces. We have some evidence on how candidates annotate their questions from mode effect studies comparing paper-based and digital assessments, but little information on which candidates annotate and how often they do so.
This article describes an exploratory study of annotations made by GCSE Combined Science and GCSE Mathematics candidates. The research analysed scripts from four random samples of 1000 candidates, one each from the Foundation and Higher tiers of each GCSE, and looked at the prevalence and types of annotation on different items. A particular motivation was to support the design of effective digital assessment in maths and science, through improving our understanding of candidates’ response activity in these subjects.
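The editorial mentions annotation heat maps built from the extracted images. One simple way to produce such a visualisation is to accumulate annotation bounding boxes onto a page-sized grid, as in the toy sketch below; the page dimensions, coordinates and data are invented, and the article's actual image-extraction pipeline is not reproduced here.

```python
# Toy annotation heat map: stack bounding boxes onto a page-sized grid.
# All dimensions and coordinates are invented for illustration.
import numpy as np
import matplotlib.pyplot as plt

page_h, page_w = 1000, 700  # assumed page raster size in pixels
heat = np.zeros((page_h, page_w))

# Each annotation is a bounding box (x0, y0, x1, y1) on the question page,
# pooled across many candidates' scripts.
annotations = [(50, 100, 200, 140), (60, 110, 180, 150), (400, 600, 500, 650)]
for x0, y0, x1, y1 in annotations:
    heat[y0:y1, x0:x1] += 1  # each box adds one to every cell it covers

plt.imshow(heat, cmap="hot", origin="upper")
plt.title("Annotation density for one question (toy data)")
plt.colorbar(label="number of overlapping annotations")
plt.show()
```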
-
Learners' annotations and written markings when taking a digital multiple-choice test: What support is needed?
Crisp, V., Vitello, S., Khan, A. A., Mahy, H., & Hughes, S. (2025). Learners' annotations and written markings when taking a digital multiple-choice test: What support is needed? Research Matters: A Cambridge University Press & Assessment publication, 39, 90-109. https://doi.org/10.17863/CAM.116171
This research set out to enhance our understanding of the exam techniques and types of written annotations or markings that learners may wish to use to support their thinking when taking digital multiple-choice exams. Additionally, we aimed to explore further the factors that lead learners to write less rough work and fewer markings on scrap paper during a digital test than they write on paper-based tests, as observed in prior research.
In this research, 52 learners attempted a digital economics test with access to either scrap paper or a print of the test. Some learners were observed in order to capture their interactions with the paper materials, all learners completed a questionnaire, and most learners were interviewed.
The evidence collected provides insights regarding the types of annotations and written markings learners wished to use. Considerable variation was found in whether, and the extent to which, learners used paper materials. Scrap paper worked fairly well for some types of annotations or written markings, but not for others. The findings are informing additional developments to testing platform functionality.
-
Research Matters 39 - Research News
Bowett, L. (2025). Research News. Research Matters: A Cambridge University Press & Assessment publication, 39, 110-112.
A summary of recent conferences, reports, blogs and research articles published since the last issue of Research Matters.