Kirsty Parkinson is the Awarding Assessment Manager at The Chartered Institute of Procurement & Supply. After a career in quality management, Kirsty found a passion for assessment and a desire to take a deep dive into the nuts and bolts of it. In 2019 she enrolled on the Postgraduate Certificate in Educational Assessment (PGCA), which is run jointly by Cambridge Assessment Network and the University of Cambridge Faculty of Education, to explore the how and why of assessment. Here she talks to us about how the course equipped her with the knowledge and tools to evaluate and innovate in her assessment practice.
Hi Kirsty, firstly would you like to tell us a bit about your career to date?
I've been at the Chartered Institute of Procurement & Supply (CIPS) for seven years and I'm currently the Awarding Assessment Manager. I look after CIPS' regulated exams, which are taken all around the globe, in the UK, Africa and the Middle East.
My background is in quality management and legal compliance and I previously worked in the food industry and professional services. I think quality management prepared me really well for working in assessment, as in a lot of ways it's the same.
Assessment is about getting it right first time, not fixing it after the exam has been sat. It's also about being able to pick up a framework and understand how it should be applied which, as it turns out, is a very transferable skill.
Did the world of assessment surprise you in any way?
The world of assessment was hugely surprising to me. I think I expected it to be more black and white.
I was surprised at how varied it is; how many different types of assessment there are and how detailed the quality measurement has to be when you're doing it properly. I found it really, really interesting; it grabbed my attention straight away.
I was fortunate that when I started my role at CIPS there were lots of people I was able to learn from, so I wasn't thrown in at the deep end doing the job by myself.
Thinking back to when you enrolled on the PGCA, what was your motivation behind that?
CIPS had a certain way of doing things, and that was all very well and good. But it wasn't always clear to me why we were doing those things. I needed to understand more about the framework that those processes sat within and why they were there.
I also knew that we were on the cusp of change, changing assessment methodologies, changing from paper to online. So knowing that we had a few years of quite transformational change ahead of us, I had lots of questions I needed to work out the answers to. And these are not things you can find on Google!
Plus, I was just very curious. I like to understand things in depth. I don't just want a surface level understanding. I want to really get into the weeds of things, to understand why we do the things that we do, and see if there's a better way.
On a personal level, I felt I'd become a bit of a Jack of all trades, master of none. I've got all sorts of strange qualifications in my toolbox, from pest control to food safety to health and safety risk assessments.
So I really wanted to become an expert in assessment. Because assessment grabbed my curiosity and it felt like I'd found my home.
Do you have an example of a question that you needed the answer to, but you didn't have?
We have a process for standardisation before we mark exams, but there's nothing out there that says a variance of three marks is good and a variance of 10 marks is bad. That's an example of something you can't find online. There's so much you need to know. There are a lot of questions underneath.
That's why Google doesn't just come up with a number. Because the answer is, ‘it depends’.
If you don't do this kind of training, you can't know the answers to these questions. And you might think everyone else knows except you. But they don't if they haven't understood the core principles of what validity is and what it is you're trying to achieve.
What were your key takeaways from your learning on the PGCA?
So one of the big takeaways for me was having access to all the research. I actually kept doing more research than the PGCA required me to do because I'm so curious and I kept finding things that were interesting. And the great thing about the PGCA is it teaches you how to evaluate that research.
I used to find it quite daunting. I would read a piece of research and think ‘I've got no idea what that just said’, but the learning trains your brain to understand what peer-reviewed research looks like, what you should be looking for and what bits you can discount. It made it a lot less intimidating than it had been before.
Another big takeaway for me was thinking about stakeholder perception as part of validity. Because I'm quite a purist and I think I would have said, well, these stakeholders may think that, but they're wrong. So it doesn't matter.
But actually it does, because if the stakeholders' perception isn't in line with the purpose of your assessment, then you haven't achieved the purpose. It seems so obvious now, but that was a huge light bulb moment for me.
So did you change your practice in light of that?
It changed nearly every part of my practice. Now, whenever I'm asked to write a paper on evaluating different assessment methodologies or changing a process, I will always include ‘how is this going to impact our stakeholders' perception of this qualification?’ That’s top of the list.
It doesn't mean that we're not going to do something because stakeholders won't perceive it well. It just means there's another piece of work to do to communicate why we're making the change. We can't just expect people to buy it.
How did the research you did on the PGCA equip you for your continued learning later on?
It made my life so much more straightforward, and less stressful. When somebody asks a tricky question, I don't have to worry about who on earth I can possibly ask that has more experience and would know the answer.
Instead, I know I just need to gather information, gather more than one viewpoint, pull it all together and apply it to the context of the exam and the product that I'm dealing with.
And if you do the research and you find no one can agree, well, then there you go. There's your answer. No one can agree, so you have to go with the best of the information that you have at the time.
Are there any recent innovations in your assessment practice that you could share?
We're currently preparing ourselves for the possibility of on-demand assessments, which will be quite an undertaking for us and will mean changing our body of evidence for validity. For example, pretesting exam questions: how do you achieve validity there? And what are we going to do about exam results?
We have a certain moderation process that happens before we release exam results, but it will be different if we release the results straight away. We also need to consider our stakeholders: what are they going to perceive to be a valid approach?
But I think the direct impact of this particular learning (on the PGCA) is that it's helped us to look at all of our processes and ask: are they the most appropriate for the product and the purpose of the exam? Are there things we're doing that actually we don't need to do? What are the bits that really add value to that validity argument?
So I think it's not just about learning to do new things, it's learning to evaluate what you're already doing.
Do you consider yourself to have an assessment identity, and if so, how would you describe that?
It's interesting, I've noticed in the last six months to a year that I keep being described by other people as CIPS’ assessment expert. And that's really, really nice. So I think that's sort of become my identity.
We've had other assessment experts at CIPS who have since moved on. So I guess I've moved into that space!
I think it's given me more confidence to take a look at some of the other assessment products that we do at CIPS and offer help to my colleagues. I think it's given me a progression path that wasn't necessarily there before.
This is part of a new series of stories about our assessment practitioner community. Some of the themes here can be explored further in our assessment professional learning framework, a statement about what we think meaningful professional learning in assessment looks like and how it can be achieved. It provides a structure for thinking about how we can ensure positive impacts of professional learning and how they can be measured.
Something as important as assessment benefits from sharing perspectives, exchanging ideas and debating the latest thinking. As the Assessment Network, we want to bring assessment practitioners together to build greater understanding. Why not join us?