Research Matters 22

  • Research Matters 22 - Foreword

    Oates, T. (2016). Foreword. Research Matters: A Cambridge Assessment publication, 22, 1.

  • Research Matters 22 - Editorial

    Green, S. (2016). Editorial. Research Matters: A Cambridge Assessment publication, 22, 1.

  • Revisiting the topics taught as part of an OCR History qualification

    Dunn, K., Darlington, E. & Benton, T. (2016). Revisiting the topics taught as part of an OCR History qualification. Research Matters: A Cambridge Assessment publication, 22, 2-8.

    Given the introduction of a broader range of options in OCR's new A level History specification, this article follows on from an earlier analysis of A level History options under OCR's previous specification (Specification A). That research relied on OCR History centres responding to requests to participate in an online survey. However, OCR's introduction of an online 'specification creator' tool for centres has since provided quantitative information about the topics which schools intend to teach their students as part of their A level. As with the previous study, we sought to establish the most common topic choices and combinations.

  • Accounting for students' mathematical preparedness for Finance and Business degrees

    Darlington, E., & Bowyer, J. (2016). Accounting for students' mathematical preparedness for Finance and Business degrees. Research Matters: A Cambridge Assessment publication, 22, 9-16.

    This article reports the findings of a large-scale study by the authors of the mathematical needs of over 4,000 prospective undergraduates in STEMM and Social Science degrees. It presents the responses of students on Business and Finance-related degrees to an online survey which asked about the Mathematics they studied at A level, how well they perceived A level Mathematics to have prepared them for the mathematical component of their degree, and ways in which they believe it could be improved to suit the needs of future undergraduates in their field. Generally, students of Business and Finance-related degrees who had taken A level Mathematics were positive about it as preparation for their degree, and perceived Statistics units as the most useful for those preparing for tertiary Business or Finance study.

  • Collaboration in the 21st century: Implications for assessment

    Child, S. & Shaw, S. (2016). Collaboration in the 21st century: Implications for assessment. Research Matters: A Cambridge Assessment publication, 22, 17-22.

    Collaboration has recently been identified as an important educational outcome in its own right, rather than just a means to develop or assess knowledge. When assessing collaboration, there is a need for a clear understanding of what is being tested, based on a theoretically sound and agreed-upon definition. In light of this important issue, this article first provides an overview of how collaboration is conceptualised, and how it is distinguished from other related group activities (e.g., cooperation). The article then moves on to discuss how different conceptualisations of collaboration underpin the development of appropriate methods of assessment. Specifically, we explore how the task given to students can optimise (or impinge upon) the opportunities for collaboration to occur amongst group members, and the issues raised in the development of large-scale assessment of this so-called 21st century skill.

  • The effect of subject choice on the apparent relative difficulty of different subjects

    Bramley, T. (2016). The effect of subject choice on the apparent relative difficulty of different subjects. Research Matters: A Cambridge Assessment publication, 22, 23-26.

    Periodically there is interest in whether some GCSE and A level subjects are more 'difficult' than others. Because students choose which subjects they take from a large pool of possible subjects, the matrix of data to be analysed contains a large amount of non-random missing data – the grades of students in subjects that they did not take. This makes the calculation of statistical measures of relative subject difficulty somewhat problematic. It is also likely to make subjects that measure something different to the majority of other subjects appear easier. These two claims are illustrated in this article with a simple example using simulated data.

  • On the impact of aligning the difficulty of GCSE subjects on aggregated measures of pupil and school performance

    Benton, T. (2016). On the impact of aligning the difficulty of GCSE subjects on aggregated measures of pupil and school performance. Research Matters: A Cambridge Assessment publication, 22, 27-30.

    It is empirically demonstrated that adjusting aggregated measures of either student or school performance to account for the relative difficulty of General Certificate of Secondary Education (GCSE) subjects makes essentially no difference. For either students or schools, the correlation between unadjusted and adjusted measures of performance exceeds 0.998. This indicates that suggested variations in the difficulty of different GCSE subjects do not cause any serious problems either for school accountability, or for summarising the achievement of students at GCSE.

  • Statistical moderation of school-based assessment in GCSEs

    Williamson, J. (2016). Statistical moderation of school-based assessment in GCSEs. Research Matters: A Cambridge Assessment publication, 22, 30-36.

    Moderation of school-based assessment (SBA), such as coursework, is required in order to ensure the comparability of marks across different centres. Under current procedures for GCSEs, moderators re-mark a sample of SBA from each centre in order to check whether any adjustment should be made to that centre’s marks. This article explores statistical moderation, an alternative form of moderation that calibrates SBA marks on the basis of a statistical relationship with another assessment, such as an exam component. The article outlines methods of statistical moderation that are used in jurisdictions around the world, and explores the effect of applying these methods to results data from three GCSEs. The analysis focuses on comparing the statistically moderated results to operational results (moderated under existing, non-statistical procedures) in terms of marks, grades, and the rank-order of candidates and centres.

  • Good - better - best? Identifying highest performing jurisdictions

    Elliott, G. (2016). Good - better - best? Identifying highest performing jurisdictions. Research Matters: A Cambridge Assessment publication, 22, 37-38.

    We have become used to references to high-performing jurisdictions, but there are now many different published rankings of jurisdictions, each of which identifies high-performers. In consequence, the number of jurisdictions which fulfil the description of being high-performing has grown to a sizeable number. Sometimes, for practical research purposes, it is desirable to identify a smaller number of the highest performing jurisdictions at a given point in time. This article explores a strategy for doing so, based upon evaluating the position of a jurisdiction across a number of different rankings.

  • Research News

    Barden, K. (2016). Research News. Research Matters: A Cambridge Assessment publication, 22, 39.

    A summary of recent conferences and seminars, and research articles published since the last issue of Research Matters.

  • Statistics Reports

    The Research Division. (2016). Statistics Reports. Research Matters: A Cambridge Assessment publication, 22, 41.

    The ongoing Statistics Reports series provides statistical summaries of various aspects of the English examination system, such as trends in pupil uptake and attainment, qualifications choice, subject combinations and subject provision at school. This article contains a summary of the most recent additions to the series.

  • Introducing Data Bytes

    Keirstead, J., Sutch, T. & Klir, N. (2016). Introducing Data Bytes. Research Matters: A Cambridge Assessment publication, 22, 42-43.

    Data Bytes is a series of data graphics from Cambridge Assessment's Research Division that is designed to bring the latest trends and research in educational assessment to a wider audience. The series can be found at http://www.cambridgeassessment.org.uk/our-research/data-bytes/
