Paul Kelly is Qualifications Director at Professional Assessment Ltd (PAL), where he leads the Resource Development Team and oversees the assessment design and production process. PAL provides end-point assessment for a wide range of apprenticeship standards for training organisations and employers, as well as audit and compliance services for training providers. We spoke with Paul about how psychology can be applied to pedagogy and assessment, considerations for accessibility in end-point assessments, and the flexibility offered by remote assessments for professional discussions.
This discussion first appeared in Perspectives on Assessment, the Cambridge Assessment Network member newsletter, which features key voices from the assessment community along with other member-exclusive content.
How did you end up specialising in assessment?
"I completed a Postgraduate Certificate in Education (PGCE) and started my teaching career in secondary school education, so I was already aware of the role of assessment, mainly formative at that point, in measuring learning and checking the progress of learners.
Early in my teaching career I also took on the part-time role of examiner and began marking GCSEs for a large awarding organisation. My eyes were opened to the craft of writing suitable questions and mark schemes that elicited legitimate responses from the candidates. I began to see what made for good questions and what didn't, and how mark schemes are not infallible and need to be updated after exams have been taken, with the changes then trained out consistently to the examiners.
I moved through the ranks, learning so much from training provided by the awarding organisation and from the more senior assessment writers I worked with. I went from examining team leader to item writer, then assistant chief examiner, to eventually having the honour and challenge of writing full GCSE assessment papers and mark schemes in ICT and being responsible for the overall marking and awarding each summer.
I spent the last few years of my teaching career in further and higher education and continued to enjoy designing my own assignments for any internal assessment elements, before taking my combined knowledge and experience, of both teaching learners and developing assessments, directly into working for awarding organisations and end-point assessment organisations (EPAOs). I have now spent the last seven years leading on assessment design, development, and performance, and associated policies and strategies."
You mentioned on the Cambridge Assessment Network forum your background in Psychology and how quantitative analysis can be applied to support validity and reliability – could you tell us a little more about this?
"I was keen to study further the scientific thinking behind the world of assessment, and to do so I completed an MSc in Psychology. The field of psychology has a long history of involvement in questionnaire design and psychometrics, and therefore in the use of research methods and statistical analysis of validity and reliability.
I had the opportunity to complete assignments using statistical software to help determine whether studies had merit and whether their findings could be generalised to wider populations. I very much bring this interest in performance statistics to my role here at Professional Assessment Ltd (PAL), where we monitor the quality of our MCQ tests using all available validity and reliability data.
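To illustrate the kind of item-level statistics involved (this is a minimal sketch, not PAL's actual tooling), two widely used MCQ quality measures are the facility index (the proportion of candidates answering an item correctly) and the point-biserial discrimination (the correlation between success on an item and candidates' total test scores):

```python
# Illustrative item analysis for an MCQ test (hypothetical data).
# facility: how easy the item is (0 = nobody correct, 1 = everybody correct).
# point_biserial: positive values mean stronger candidates tend to get
# the item right, i.e. the item discriminates across the grade range.

def facility(item_scores):
    """Proportion of candidates answering the item correctly."""
    return sum(item_scores) / len(item_scores)

def point_biserial(item_scores, total_scores):
    """Pearson correlation between a 0/1 item and candidates' total scores."""
    n = len(item_scores)
    mean_i = sum(item_scores) / n
    mean_t = sum(total_scores) / n
    cov = sum((i - mean_i) * (t - mean_t)
              for i, t in zip(item_scores, total_scores)) / n
    var_i = sum((i - mean_i) ** 2 for i in item_scores) / n
    var_t = sum((t - mean_t) ** 2 for t in total_scores) / n
    return cov / (var_i ** 0.5 * var_t ** 0.5)

# Five candidates, one item: the stronger candidates answered correctly.
item = [1, 1, 1, 0, 0]
total = [28, 25, 22, 14, 10]
print(round(facility(item), 2))                # 0.6 (moderately easy)
print(round(point_biserial(item, total), 2))   # 0.94 (discriminates well)
```

An item with a very low or negative point-biserial would be flagged for review, since weaker candidates are doing as well on it as stronger ones.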
Looking at how psychology can be applied to pedagogy and assessment was also fascinating. There are a number of other areas of psychology that I studied which I now apply to assessment development and delivery in my EPAO; these include minimising the negative impacts of cognitive demand on learners during professional discussion assessments and mitigating the risks of assessors showing unconscious bias when carrying out observation assessments."
You mention minimising cognitive demand on learners during professional discussion. What considerations might other practitioners working in this area want to think about?
"The issue of cognitive demand in professional discussions, and in oral assessments in general, is very interesting to me: learners must juggle thinking on their feet and meeting the assessment criteria while also handling their nerves.
To mitigate this, we work to ensure that our assessors put learners at ease as much as possible by building rapport and providing clear instructions on timings, structure and so on. We also provide assessors with protocols on how to use any follow-up questions fairly and consistently, and we construct question banks for assessors that include indicative content for grading purposes. We also advise our apprentices to practise as much as possible and to have well-constructed portfolios to refer to, where underlying portfolios are allowed by the assessment plan.
I always refer to this Cambridge paper by Ayesha Ahmed et al, when discussing verbal assessment:
Assessing Thinking and Understanding: Can Oral Assessment Provide a Clearer Perspective?"
What do you find most rewarding about your role?
"There is so much that I find rewarding it would be very hard to pick just one thing. First of all, I relish the challenge of bringing apprenticeship standard assessment plans to life by designing and developing the valid assessments needed for the end-point assessment component.
These assessments, in line with the assessment plans, then allow apprentices to demonstrate their competency fairly and our assessors to reach reliable judgements, which is something that makes me feel very proud. I really enjoy working with my resource development team to translate assessment blueprints and strategies into finalised, quality-assured live assessments. Another big part of my enjoyment is working with our subject specialist associates, who use all their vast experience to collaborate with us in creating the assessment materials.
Of course, there are also the statistics I mentioned earlier: when they confirm that the MCQs in a test are performing as they should, neither too easy nor too difficult, and that items are discriminating correctly across the range of grades, it is very pleasing.
I have the opportunity every day to apply the science of assessment to ensure we are compliant and our assessments are fit for purpose, and that is both intellectually stimulating and very rewarding. I also regularly have the opportunity to provide continuing professional development for colleagues and share my assessment knowledge with the rest of my organisation."
What kind of challenges with accessibility do you encounter in the world of end point assessments?
"Like all organisations that develop and deliver assessments we face the challenge of delivering accessibility. It is our moral and regulatory duty to ensure that no learners are unfairly disadvantaged when taking our assessments. We have policies and procedures in place to prevent any artificial barriers to accessibility being inadvertently established and to meet the regulator’s conditions.
So, in practice this means many different factors, including making sure that assessment instructions for learners are clear, consistent, and in plain English. It also means we work hard to ensure that the language we use in assessments is free of bias and stereotypes, and is as accurate and clear as possible. A key part of accessibility that must not be forgotten is ensuring that assessments are fairly mapped to standards, assessment plans, and assessment criteria, so that only valid content is being assessed.
We also of course allow reasonable adjustments, such as giving extra assessment time or providing a reader, where there is evidence to show that these adjustments are necessary.
We are currently piloting a text-to-speech add-on to our assessment platform, which uses natural-sounding voices, to see whether it will further support learners with the challenges they face; this is likely to be implemented fully in the near future."
Have you been involved with remote end point assessment in your current role, perhaps as a result of the pandemic?
"Yes, the pandemic brought in a number of changes, some of which will remain, such as carrying out professional discussion assessments remotely. Our stakeholders enjoy the flexibility this remote assessment brings, and because all professional discussion sessions are recorded, this also brings a wealth of confidence in the quality of assessment delivery and grading decisions.
Our quality team can sample all professional discussion assessments in part or in full in line with our sampling strategy and they also have a bank of examples which can be used for assessor training and standardisation purposes.
The vast majority of our (MCQ) knowledge tests are taken remotely and onscreen. Again, this provides flexibility to suit local needs, and the recording of sessions, together with removing the need to transport papers physically across the country, gives confidence in the security and integrity of our test processes and environments. We always provide a remote invigilator, and our tests are invigilated live on a one-to-one basis. It is important to have clear protocols on how remote tests are conducted, and to keep your eyes open to the potential risk of malpractice, in order to maximise the benefits of remote assessment."
Would you like to feature in a future Member Spotlight? We'd love to hear from you: thenetwork@cambridgeassessment.org.uk