-
Research Matters 23 - Foreword
Oates, T. (2017). Foreword. Research Matters: A Cambridge Assessment publication, 23, 1.
A UK newspaper headline recently declared that “China uses drones to catch students cheating in exams” (Telegraph, 5 June 2015). Education authorities in Luoyang, central China, used the latest generation of drones, which “From heights of up to 1,640 feet…will be able to home in on radio signals created by students who are using hidden earpieces to obtain the answers to exam questions…”. New technology, old problem.
Download
-
Research Matters 23 - Editorial
Green, S. (2017). Editorial. Research Matters: A Cambridge Assessment publication, 23, 1.
The first three articles in this issue feature the use of technology, albeit in very different contexts. The last two articles move away from the technology theme and focus on qualifications that assess complex skills and competence.
Download
-
Tweeting about exams: Investigating the use of social media over the summer 2016 session
Sutch, T. and Klir, N. (2017). Tweeting about exams: Investigating the use of social media over the summer 2016 session. Research Matters: A Cambridge Assessment publication, 23, 2-9.
In recent years, social media discussion of particular GCSE and A level exams and questions has led to coverage in the national media. Using exam-related tweets collected from Twitter in real time, we investigated the extent of this phenomenon, the topics being discussed and the sentiments being expressed. We quantified sentiment by monitoring the occurrence of popularly used emoji within the tweets. We found that the overall volume of tweets followed weekly and daily patterns, with activity peaking in the periods just before and after exams. Discussion of particular subjects was concentrated on days when relevant exams took place. When we focused on the Mathematics GCSE papers sat on a particular day, we were able to identify several distinct phases based on the words and emoji used in tweets: discussion switched from revision to wishing others luck before the exam, then reflecting on performance and discussing individual questions afterwards.
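The abstract describes quantifying sentiment by counting popular emoji within tweets. A minimal sketch of that idea is below; the particular emoji chosen and their sentiment labels are assumptions for illustration, not the article's actual coding scheme.

```python
from collections import Counter

# Hypothetical emoji-to-sentiment mapping (an assumption for this sketch).
EMOJI_SENTIMENT = {
    "\U0001F602": "positive",  # face with tears of joy
    "\U0001F64F": "positive",  # folded hands (good luck)
    "\U0001F62D": "negative",  # loudly crying face
    "\U0001F620": "negative",  # angry face
}

def sentiment_counts(tweets):
    """Tally sentiment labels for every known emoji occurrence."""
    counts = Counter()
    for tweet in tweets:
        for char in tweet:
            label = EMOJI_SENTIMENT.get(char)
            if label:
                counts[label] += 1
    return counts

tweets = [
    "Maths paper 1 done \U0001F602\U0001F602",
    "good luck everyone \U0001F64F",
    "question 14 \U0001F62D",
]
print(sentiment_counts(tweets))  # Counter({'positive': 3, 'negative': 1})
```

Counting occurrences rather than whole tweets means a tweet containing several emoji contributes proportionally, which matches the idea of monitoring emoji frequency over time.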
Download
-
The clue in the dot of the ‘i’: Experiments in quick methods for verifying identity via handwriting
Benton, T. (2017). The clue in the dot of the ‘i’: Experiments in quick methods for verifying identity via handwriting. Research Matters: A Cambridge Assessment publication, 23, 10-16.
This article demonstrates some simple and quick techniques for comparing the style of handwriting between two exams. This could be a useful way of checking that the same person has taken all of the different components leading to a qualification, and form one part of the effort to ensure qualifications are only awarded to those candidates who have personally completed the necessary assessments. The advantage of this form of identity checking is that it is based upon data (in the form of images) that is already routinely stored as part of the process of on-screen marking. This article shows that some simple metrics can quickly identify candidates whose handwriting shows a suspicious degree of change between occasions. However, close scrutiny of some of these scripts provides some reasons for caution in assuming that all cases of changing handwriting represent the presence of imposters. Some cases of apparently different handwriting also include aspects that indicate they may come from the same author. In other cases, the style of handwriting may change even within the same examination response.
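The approach of summarising script images with simple metrics and flagging large changes between occasions can be sketched as follows. The metric (ink density on toy binary "scans") and the flagging threshold are illustrative assumptions, not the article's actual measures.

```python
# Hypothetical sketch: summarise each scanned answer image with a simple
# number and flag candidates whose metrics differ greatly between two
# exam components.

def ink_density(image):
    """Fraction of 'ink' pixels in a binary image (1 = ink, 0 = background)."""
    pixels = [p for row in image for p in row]
    return sum(pixels) / len(pixels)

def metric_distance(image_a, image_b):
    """Absolute difference in ink density between two script images."""
    return abs(ink_density(image_a) - ink_density(image_b))

def flag_suspicious(image_a, image_b, threshold=0.2):
    """Flag a pair of scripts whose summary metrics differ beyond a threshold."""
    return metric_distance(image_a, image_b) > threshold

# Toy 4x4 binary "scans": one dense hand and one much lighter one.
script_1 = [[1, 1, 0, 1], [1, 0, 1, 1], [0, 1, 1, 1], [1, 1, 1, 0]]
script_2 = [[0, 0, 0, 1], [0, 0, 0, 0], [1, 0, 0, 0], [0, 0, 0, 0]]

print(flag_suspicious(script_1, script_2))  # True
```

As the abstract cautions, a flag like this only identifies scripts worth closer human scrutiny; a metric change does not by itself establish the presence of an imposter.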
Download
-
Evaluating blended learning: Bringing the elements together
Bowyer, J. and Chambers, L. (2017). Evaluating blended learning: Bringing the elements together. Research Matters: A Cambridge Assessment publication, 23, 17-26.
This article provides a brief introduction to blended learning, its benefits and factors to consider when implementing a blended learning programme. It then concentrates on how to evaluate a blended learning programme and describes a number of published evaluation frameworks. There are numerous frameworks and instruments for evaluating blended learning, although no particular one seems to be favoured in the literature. This is partly due to the diversity of reasons for evaluating blended learning systems, as well as the many intended audiences and perspectives for these evaluations. The article concludes by introducing a new framework which brings together many of the constructs from existing frameworks whilst adding new elements. Its aim is to encompass all aspects of the blended learning situation to permit researchers and evaluators to easily identify the relationships between the different elements whilst still enabling focussed and situated evaluation.
Download
-
An analysis of the effect of taking the EPQ on performance in other Level 3 qualifications
Gill, T. (2017). An analysis of the effect of taking the EPQ on performance in other Level 3 qualifications. Research Matters: A Cambridge Assessment publication, 23, 27-34.
The Extended Project Qualification (EPQ) is a stand-alone qualification taken by sixth form students. It involves undertaking a substantial project, where the outcome can range from writing a dissertation or report to putting on a performance. It is possible that some of the skills learnt by students whilst undertaking their project (e.g., independent research, problem-solving) could help them in other qualifications taken at the same time. Two separate investigations were undertaken: firstly, the performance of individual students was analysed, using a multilevel regression model to compare EPQ and non-EPQ students. The results showed a small but statistically significant effect, with those taking the EPQ achieving better results on average in their A levels. The second investigation analysed performance at school level, using a regression to model the effect of increasing the percentage of students in a school taking the EPQ. The results showed a significant and positive effect of increasing the percentage of students taking the EPQ. However, the effect was very small.
Download
-
A review of instruments for assessing complex vocational competence
Greatorex, J., Johnson, M. and Coleman, V. (2017). A review of instruments for assessing complex vocational competence. Research Matters: A Cambridge Assessment publication, 23, 35-42.
The aim of the research was to explore the measurement qualities of checklists and Global Rating Scales (GRS) in the context of assessing complex competence. Firstly, we reviewed the literature about the affordances of human judgement and the mechanical combination of human judgements. Secondly, we reviewed examples of checklists and GRS which are used to assess complex competence in highly regarded professions. These examples served to contextualise and elucidate assessment matters. Thirdly, we compiled research evidence from the outcomes of systematic reviews which compared advantages and disadvantages of checklists and GRS. Together the evidence provides a nuanced and firm basis for conclusions. Overall, the literature shows that mechanical combination can outperform the human integration of evidence when assessing complex competence, and that therefore a good use of human judgements is in making decisions about individual traits, which are then mechanically combined. The weight of evidence suggests that GRS generally achieve better reliability and validity than checklists, but that a high quality checklist is better than a poor quality GRS. The review is a reminder that involving assessors in the design of assessment instruments can help to maximise manageability.
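The idea of mechanically combining human trait judgements can be sketched briefly: assessors rate individual traits, and a fixed formula, rather than holistic human integration, produces the overall score. The trait names and weights below are assumptions for the example, not the article's actual instrument.

```python
# Hypothetical fixed weights for mechanically combining trait ratings.
TRAIT_WEIGHTS = {
    "communication": 0.3,
    "technique": 0.5,
    "professionalism": 0.2,
}

def combined_score(trait_ratings):
    """Mechanically combine per-trait ratings (e.g., points on a 1-5
    rating scale) into one overall score using fixed weights."""
    return sum(TRAIT_WEIGHTS[t] * r for t, r in trait_ratings.items())

# One assessor's judgements on the individual traits.
ratings = {"communication": 4, "technique": 3, "professionalism": 5}
print(round(combined_score(ratings), 2))  # 3.7
```

The human contribution is confined to the per-trait decisions; the combination step is deterministic and identical for every candidate, which is what the reviewed literature credits for the performance of mechanical combination.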
Download
-
Statistics Reports
The Research Division. (2017). Statistics Reports. Research Matters: A Cambridge Assessment publication, 23, 43.
The ongoing Statistics Reports Series provides statistical summaries of various aspects of the English examination system, such as trends in pupil uptake and attainment, qualifications choice, subject combinations and subject provision at school. This article contains a summary of the most recent additions to this series.
Download
-
Research News
Barden, K. (2017). Research News. Research Matters: A Cambridge Assessment publication, 23, 44-45.
A summary of recent conferences and seminars, and research articles published since the last issue of Research Matters.
Download
-
Data Bytes
The Data & Analytics Team. (2017). Data Bytes. Research Matters: A Cambridge Assessment publication, 23, 46.
Data Bytes is a series of data graphics from Cambridge Assessment’s Research Division that is designed to bring the latest trends and research in educational assessment to a wider audience. The series can be found at http://www.cambridgeassessment.org.uk/our-research/data-bytes/
Download