-
Research Matters 38 - Foreword
Oates, T. (2024). Foreword. Research Matters: A Cambridge University Press & Assessment publication, 38, 4.
With 2024 as the third year in which exams were taken in lower and upper secondary, there is a pervasive sense of "getting back to pre-pandemic arrangements". Proposals for change are coming thick and fast – some stimulated by the impact of COVID-19 and some nascent in the system from before the pandemic. The assessment research community continues to run hot: administering national exams, understanding what has happened over the past five years, and servicing calls for changed arrangements. Let’s all work to keep both the profession and policy makers supplied with high-quality evidence that relates to the questions that matter most.
-
Research Matters 38 - Editorial
Crisp, V. (2024). Editorial. Research Matters: A Cambridge University Press & Assessment publication, 38, 5.
Our first article explores disruption to school life in the context of the COVID-19 pandemic, identifies a set of macro- and micro-strategies used by schools and discusses the broader implications for emergency readiness. Our second article addresses a decision that has to be made about any high stakes test for any qualification type and subject: how long the test should be. Our third article explores the background characteristics of those taking Core Maths, the other qualifications and subjects these learners also study, and whether taking Core Maths is associated with better results in other qualifications with a quantitative element. Our fourth article reviews the existing literature on the comparability of typed and handwritten long-answer responses, in terms of scores, text characteristics, marking, and composing processes. Our final article explores the use of the Comparative Judgement method for the assessment of music compositions and performances based on audio recordings.
-
Troubleshooting in emergency education settings
Constantinou, F. (2024). Troubleshooting in emergency education settings: What types of strategies did schools employ during the COVID-19 pandemic and what can they tell us about schools’ adaptability, values and crisis-readiness? Research Matters: A Cambridge University Press & Assessment publication, 38, 6-27. https://doi.org/10.17863/CAM.111626
With crises such as epidemics, wars, wildfires, earthquakes, and hurricanes becoming increasingly common in various parts of the world, it is crucial that schools become crisis-ready. Crisis-readiness lies partly in the ability of schools to deliver “emergency education” (i.e., education in crisis situations) promptly and effectively. To support the delivery of emergency education, this study sought to document and examine the strategies employed by schools during a crisis, specifically the COVID-19 pandemic. Through analysing data collected from interviews with teachers based in different parts of Europe, the study identified a series of micro-level strategies used by schools to address the challenges posed by the pandemic. These micro-level strategies were subsequently analysed to develop a typology of overarching mechanisms, or macro-level strategies. As discussed in the article, apart from providing a useful starting point for any teachers required to deliver emergency education in the future, these emergency strategies also offer valuable insights into schools’ adaptability, values, and crisis-readiness. As such, they could prove very informative for both educational policy and practice.
-
How long should a high stakes test be?
Benton, T. (2024). How long should a high stakes test be? Research Matters: A Cambridge University Press & Assessment publication, 38, 28-47. https://doi.org/10.17863/CAM.111627
This article discusses one of the most obvious questions in assessment design: if a test has a high stakes purpose, how long should it be?
Firstly, we explore this question from a psychometric point of view, starting from the (range of) minimum test reliability levels suggested in the academic literature. Then, using published data on the typical relationship between the length, duration and reliability of exams, we develop a range of recommendations about the likely required duration of assessment.
Secondly, to force deeper reflection on the results from the psychometric approach, we also compare the actual lengths of exams in England to those in other education systems around the world. Such comparisons reveal very wide variations in the amount of time young people are required to spend taking exams in different countries and at various ages. This article concludes with some reflections on how the length of exams relates to the purpose of the assessment or to how its results will be used.
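As background to the psychometric approach, test length and reliability are commonly linked through the Spearman-Brown prophecy formula; this is a standard result offered for orientation, not necessarily the exact relationship used in the article. If a test of unit length has reliability $r$, a test lengthened by a factor $k$ is predicted to have reliability
$$r_k = \frac{kr}{1 + (k - 1)r},$$
so, for example, an exam with reliability 0.8 would need to be roughly doubled in length to reach about 0.89.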
-
Core Maths: Who takes it, what do they take it with, and does it improve performance in other subjects?
Gill, T. (2024). Core Maths: who takes it, what do they take it with, and does it improve performance in other subjects? Research Matters: A Cambridge University Press & Assessment publication, 38, 48-65. https://doi.org/10.17863/CAM.111628
Core Maths qualifications were introduced into the post-16 curriculum in England in 2014 to help students develop their quantitative and problem-solving skills. Taking the qualification should also give students confidence in understanding the mathematical content in other courses taken at the same time.
In this article, we explore whether Core Maths is fulfilling its aims. In particular:
• Does Core Maths provide students with a benefit (in terms of attainment) in other, quantitative, Key Stage 5 subjects (e.g., A Level Psychology, BTEC Engineering)?
We also investigate some aspects of the uptake of Core Maths:
• What are the background characteristics of Core Maths students (e.g., gender, prior attainment, ethnicity)?
• Which other qualifications (e.g., A Levels, BTECs, Cambridge Technicals) and subjects are students most likely to take alongside Core Maths?
The main finding was that students taking Core Maths had a slightly higher probability (than those not taking Core Maths) of achieving good grades in some subjects taken concurrently. Uptake of Core Maths remains relatively low, so there is certainly scope for greater numbers of students to take advantage of the potential benefits of studying the qualification.
-
Does typing or handwriting exam responses make any difference?
Lestari, S. (2024). Does typing or handwriting exam responses make any difference? Evidence from the literature. Research Matters: A Cambridge University Press & Assessment publication, 38, 66-81. https://doi.org/10.17863/CAM.111629
Despite the increasing ubiquity of computer-based tests, many general qualifications examinations remain in a paper-based mode. Insufficient and unequal digital provision across schools is often identified as a major barrier to a full adoption of computer-based exams for general qualifications. One way to overcome this barrier is gradual adoption, involving the dual running of paper-based and computer-based exams. When an exam is offered in both modes, and results from both are treated as equivalent, the comparability between modes needs to be ascertained. This includes examining whether the mode in which students respond to extended writing questions such as essays, either by handwriting or by typing on the computer, introduces systematic differences. This article presents findings from a review of existing literature on writing mode effects. Specifically, it discusses findings on four comparability aspects: scores, marking, text characteristics and composing processes. It also offers recommendations for practice.
-
Comparing music recordings using Pairwise Comparative Judgement: Exploring the judge experience
Chambers, L., Walland, E., & Ireland, J. (2024). Comparing music recordings using Pairwise Comparative Judgement: Exploring the judge experience. Research Matters: A Cambridge University Press & Assessment publication, 38, 82-98. https://doi.org/10.17863/CAM.111630
Comparative Judgement (CJ) is traditionally and primarily used to compare written texts. In this study we explored whether we could extend its use to comparing audio files. We used GCSE Music portfolios which contained a mix of audio recordings, musical scores and text documents. Fifteen judges completed two exercises: one comparing musical compositions and one comparing musical performances. For each exercise, each judge compared 80 pairs of portfolios. Once judges had finished both exercises, they completed a questionnaire about their views and experiences of the method. Here, we present the judges’ perceptions of using CJ in this context with reference to the Dimensions of judge decision-making model (Leech & Chambers, 2022). We also compare the findings to those from text-based CJ studies.
Leech, T., & Chambers, L. (2022). How do judges in Comparative Judgement exercises make their judgements? Research Matters: A Cambridge University Press & Assessment publication, 33, 31–47.
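For orientation, pairwise Comparative Judgement data are conventionally scaled with the Bradley-Terry model; this is a standard account of the method in general, not a claim about the specific analysis in this study. Each portfolio $i$ is assigned a quality parameter $\theta_i$, and the probability that a judge prefers portfolio $i$ over portfolio $j$ is modelled as
$$P(i \succ j) = \frac{e^{\theta_i}}{e^{\theta_i} + e^{\theta_j}},$$
with the $\theta$ values estimated from the full set of judgements to place all portfolios on a common scale.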
-
Research Matters 38 - Research News
Bowett, L. (2024). Research News. Research Matters: A Cambridge University Press & Assessment publication, 38, 99-101.
A summary of recent conferences, reports, blogs and research articles published since the last issue of Research Matters.