Digital research methods in action

Collaboration with external education experts (consultants and examiners) is essential to the research done in the Research Division and we face a constant challenge in making our co-operation with them as time- and cost-effective as possible. This year's work using CRAS (a qualitative framework to analyse the cognitive demands of exam questions) saw a long-running project take a significant step forward when the examiners involved employed computer-based data capture of their judgements for the first time.

In 1998 a research team from the Research Division developed the CRAS framework (Hughes, Pollitt & Ahmed, 1998). The original framework had four dimensions:

Complexity:

This refers to the number of relevant elements which comprise a task and the links between them. A task of high demand on the Complexity dimension would require a lot of mental processing of different components; in a language examination, for example, these might be words, grammatical forms or other pieces of information. A task of low demand would use a more limited number of elements.

Resources:

This refers to the candidate's use of data in completing a task. A task of high demand on the Resources dimension would require the candidate to draw upon their own knowledge or to select appropriate data from a given range. A task of low demand would provide most or all of the necessary information.

Abstractness:

This refers to the extent to which a task employs ideas rather than concrete objects. A task of high demand on the Abstractness dimension would use content that lies outside the candidate's direct experience, whereas a task of low demand would deal with phenomena rooted in everyday physical life.

Strategy:

This refers to the demands placed upon a candidate in forming and executing strategies to complete a task successfully. A candidate responding to a task of high demand on the Strategy dimension would need to develop, organise and monitor their responses, and possibly select content from a larger pool. A candidate responding to a task of low demand would find the strategy apparent in the task itself, would have little need to monitor or organise, and might only need to select information from a smaller pool.

CRAS offers a generic framework for analysing the cognitive demands that tasks place on students. In each study the researchers need to establish whether their context is suitable for applying CRAS and, if so, interpret the dimensions within that context (Johnson & Mehta, 2011). The framework has been used in much comparability and validity research (Shaw & Crisp, 2012; Johnson, 2012; Johnson & Mehta, 2011).
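To make the shape of a CRAS judgement concrete, the sketch below shows one way a single rating could be represented in code. It is purely illustrative: the record and field names are assumptions drawn from the description above, and the 1 (low demand) to 5 (high demand) scale is the one used in the data collection described later; none of this reflects actual CRAS tooling.

```python
from dataclasses import dataclass

# The four CRAS dimensions described above (names assumed for illustration).
DIMENSIONS = ("complexity", "resources", "abstractness", "strategy")


@dataclass
class CrasRating:
    """One examiner's judgement of one question (illustrative only).

    Each dimension is rated on the 1 (low demand) to 5 (high demand)
    scale; the field names are assumed, not official CRAS terminology.
    """
    examiner: str
    question: str
    complexity: int
    resources: int
    abstractness: int
    strategy: int

    def __post_init__(self):
        # Reject out-of-range ratings at the point of entry.
        for dim in DIMENSIONS:
            value = getattr(self, dim)
            if not 1 <= value <= 5:
                raise ValueError(f"{dim} must be between 1 and 5, got {value}")
```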

Previously, CRAS studies unfolded as follows:

Generally the first stage was to develop a shared understanding of cognitive demands (as described by CRAS); an example is described in QCA (2008). To this end the participants (examiners) attended a meeting at which they discussed a description of cognitive demands and used it to judge the demands of some questions. Elements of the participants' discussion were added to the original CRAS descriptors to capture how cognitive demand was interpreted in relation to the target assessment. At this stage any misunderstandings of the CRAS framework could also be resolved.

The second stage was the main data collection. The participants were given many documents: written instructions, response sheets, question papers, mark schemes, the updated version of CRAS, and a list of documents to return. The packages could be a couple of centimetres thick. Working in a secure place, each participant individually rated the cognitive demands each question placed on candidates, where 1 indicated low demand and 5 indicated high demand. Ratings were made for each dimension of CRAS and entered on hardcopy recording sheets. Once the judgements were recorded, all the paperwork was returned to the researchers, who checked that all the necessary data had been provided and that all the documents had been returned.
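The checking step described above (confirming that every expected judgement has been received) can be pictured as a simple completeness check. The sketch below is a hypothetical illustration of that step, reusing the CrasRating records from the earlier sketch; it is not the researchers' actual procedure.

```python
def find_missing_judgements(ratings, examiners, questions):
    """Return (examiner, question) pairs with no recorded rating.

    `ratings` is an iterable of CrasRating records; `examiners` and
    `questions` list everything the study expected to receive.
    """
    received = {(r.examiner, r.question) for r in ratings}
    expected = {(e, q) for e in examiners for q in questions}
    return sorted(expected - received)
```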

This year an internet-based data collection tool mediated this exchange: the research team shared the project materials and requirements via links, and the participants returned their judgements as ratings and comments. There were several advantages to this approach.

Remote access to the materials meant there was no need to hold a meeting with participants to discuss the theory and practice with them. In the past, arranging meetings convenient for all parties was time-consuming and complex; the new approach freed up time for researchers and participants alike, although some time was spent finding an appropriate way of sharing documents online and arranging access for the participants. Time, paper and money were saved by sending and receiving completed responses online instead of by post. This had the added advantage of removing the need for researchers to key in the participants' responses once they had been received, making the process quicker and reducing the chance of error in data recording.
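Removing the keying-in step means an online tool's export can be read straight into analysis. As an illustration only (the article does not name the tool or its export format), the sketch below reads a hypothetical CSV export into the CrasRating records used above; the range check in that record then flags any malformed entries automatically.

```python
import csv

def load_ratings(path):
    """Read ratings from a hypothetical CSV export (illustrative only).

    Assumes columns named examiner, question, complexity, resources,
    abstractness and strategy; a real tool's export would differ.
    """
    with open(path, newline="") as f:
        return [
            CrasRating(
                examiner=row["examiner"],
                question=row["question"],
                complexity=int(row["complexity"]),
                resources=int(row["resources"]),
                abstractness=int(row["abstractness"]),
                strategy=int(row["strategy"]),
            )
            for row in csv.DictReader(f)
        ]
```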

As might be expected with a relatively new approach, some technical problems arose in its execution. We worked with participants to overcome these, so there was no impact on the study: all the data were gathered and no one dropped out.

The success of the recent innovations will help the Research Division (and other CRAS users) to develop new practices for long-term improvements in research and data collection.

Jackie Greatorex, Jo Ireland and David Beauchamp
Research Division, Cambridge Assessment

References

Hughes, S., Pollitt, A., & Ahmed, A. (1998). The development of a tool for gauging the demands of GCSE and A Level exam questions. Paper presented at the British Educational Research Association, The Queen's University, Belfast. 

Johnson, M. (Speaker). (2012). Research methods in action - using the CRAS framework and Kelly's Repertory Grid technique [Audio podcast].

Johnson, M., & Mehta, S. (2011). Evaluating the CRAS Framework: Development and recommendations. Research Matters: A Cambridge Assessment Publication, 12, 27-33.

Qualifications and Curriculum Authority (QCA). (2008). Study 2a: A level biology, psychology and sociology. Inter-subject comparability studies. London: Crown Copyright.

Shaw, S. D., & Crisp, V. (2012). An approach to validation: Developing and applying an approach for the validation of general qualifications. Research Matters: A Cambridge Assessment Publication, Special Issue 3, 3.
