Introducing one of our new Cambridge Assessment Network Ambassadors - Gwyneth Toolan. In this feature Gwyneth, Innovation Product Manager at RM in the UK, talks about teaching in Thailand, her "gratifying" experience on the Postgraduate Advanced Certificate, innovation and digital assessment, and why working on a "living topic" like assessment never gets dull!
You began your career in teaching. How and what did you learn about assessment during that time?
"When I was 23 I was teaching English literature in Thailand, prior to doing my PGCE in the UK, and we were expected to devise our own end of year assessments. I had no idea what I was doing! I suspect it mostly tested memory, knowledge and understanding and if it did test analysis and evaluation, it was because I copied the style of English assessments I was given when I was at school.
I later learnt that assessment is far more complex; the need for tests to be valid and to fulfil a specific purpose is something I hadn’t really mused over until I embarked on the Postgraduate Advanced Certificate in Educational Studies: Educational Assessment at Cambridge. I realise now that testing is itself an imperfect art, and that there is an entire philosophical backdrop underpinning assessment validity. As teachers in the UK, we worked closely with awarding organisations for guidance on assessment, and on course content too. I think this relationship is an essential part of the education system.
You mention your study on the Postgraduate Advanced Certificate – what was it about assessment that motivated you to study at that level?
After teaching English and then leading a Sixth Form for 12 years, I moved to work as an Assessment Manager at Cambridge International and it became clear there was still so much to learn about assessment design and assessment principles.
I love learning, so for me embarking on the Postgraduate Advanced Certificate was a no-brainer! "Learning for work is even more gratifying in my opinion, because of the practical opportunities to see learning happen." Shortly after completing my studies, I began re-designing the IGCSE sociology syllabus and was able to draw on my own research into the choice of command words in the IGCSE sociology and the new A Level sociology syllabuses.
I was interested in the impact of command words on perceptions of level of demand or difficulty. It turns out that ‘to what extent’ felt more difficult to test-takers than ‘evaluate’ when they read exam questions. Such insights helped inform the entire syllabus development process, meaning we could use evidence to justify changes and ensure our work benefitted candidates by making assessments as accessible as possible. The connection between theory and practice became concrete here, and I found that really fulfilling!
Are there any particular learnings from that course that you find yourself reflecting on regularly?
At RM I am often approached by colleagues wanting teaching and assessment insights into projects they’re doing, or to get my thoughts on new products in innovation. When we were working on our new assessment strategy, I was able to offer insights into potential customer perceptions of RM to help us improve our services, ensuring we remain sensitive to the needs of candidates, teachers, centres and examiners. This required me to draw on my assessment knowledge as well as my experience working in the domain.
I regularly reflect on validity and its complexities. This comes into our innovation work in the Studio area of RM, where I work. We work hard to validate our hypotheses through user interviews and primary research, to avoid building new Edtech based solely on assumptions. I reflect on the processes as well as the principles of assessment, ensuring that the delivery of technological solutions fulfils actual needs and fits with educational purposes, to ultimately save teachers and assessors time. An example of this work is when colleagues of mine at RM re-designed the user interface of RM Assessor 3. I remember working with examiners at Cambridge to gather feedback on behalf of RM, and was able to witness how positively they responded to the new user experience it offered. This relationship between the operational and the theoretical is central to assessment and is something I am constantly thinking about.
What is your job role at RM?
The RM group is actually made up of three different businesses, providing teaching and learning resources, software and services to schools and colleges, and high-stakes e-assessment technology to governments and awarding organisations like Cambridge University Press & Assessment. My role is within the Assessment business, working across Product Management and Innovation. There’s a large community within RM of teachers, ex-teachers, governors and educationalists, which allows us to genuinely connect with our customers and the kinds of challenges they face. We don’t just try to solve problems; we want to bring real empathy to why things matter to our customers.
Within the innovation arm, we have RM Studio, which unites a small yet inclusive community of innovators, providing a supportive and safe environment for exploring ideas, new techniques and risk-taking. Within this environment, we promote a new way of working and are free to explore emerging digital pedagogies, making the most of valuable opportunities for RM and our customers. We believe integrating our customers into our innovation process is essential for crafting products and services that meet our users’ needs and improve the quality of education.
The teams that I work with share a common culture of collaboration and open-mindedness. Innovation teams in particular are cross-functional by nature, meaning that everybody on the team brings unique perspectives, skills and experience. This style of working is often called Agile; Scrum is one of its best-known frameworks.
What’s more, in RM Studio we take a human-centred approach to the way we tackle problems. We co-design with users to build solutions that solve real problems for real people. We try to adhere to the principle of ‘fail early, fail often’. No one designs a product perfectly the first time round, so we embrace the failure that comes with discovery. This is common practice at Google and other technology companies: you should expect failure, and you should accept it, because iterating quickly on small blocks of work, rather than taking a monolithic approach, allows you to learn quickly and adapt to changing circumstances. Better to discover an approach is a dead end after a week of design time than after burning three months of budget.
We try to get working Edtech prototypes in front of people as early as possible. These prototypes can be as low fidelity as a clickable mock-up in a design package such as Figma, or higher fidelity, like a prototype app running on mocked-up data. The key is that they are built to be thrown away, and we try not to become attached to an approach before it has been validated with real users.
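To illustrate the higher-fidelity end of that spectrum, here is a minimal, hypothetical sketch in TypeScript of the kind of throwaway prototype described above: a results screen driven entirely by hard-coded mock data. The data shape and names are invented for illustration and are not RM’s actual code.

```typescript
// Hypothetical throwaway prototype: a candidate results screen
// driven by hard-coded mock data, so it needs no backend and can
// be put in front of users (and discarded) quickly.

interface MockResult {
  candidateId: string;
  subject: string;
  marksAwarded: number;
  marksAvailable: number;
}

// Hard-coded data stands in for a real marking service.
const mockResults: MockResult[] = [
  { candidateId: "C-001", subject: "Sociology", marksAwarded: 54, marksAvailable: 80 },
  { candidateId: "C-002", subject: "Sociology", marksAwarded: 71, marksAvailable: 80 },
];

// A deliberately simple text rendering: the point of a prototype
// is speed of learning, not polish.
function renderResults(results: MockResult[]): string {
  return results
    .map(r => `${r.candidateId} (${r.subject}): ${r.marksAwarded}/${r.marksAvailable}`)
    .join("\n");
}

console.log(renderResults(mockResults));
```

Because nothing here depends on real infrastructure, the whole screen can be rebuilt or thrown away in minutes once user feedback comes in.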
Can you tell us more about the considerations assessment practitioners should be making to ensure tests are produced with the learner in mind and not the machine?
Ironically, given my line of work, I have never felt I performed well under pressure, so I am relieved I no longer need to worry about exams! I also wonder how valid it is to test memory rather than use ‘open book’ assessments, for example, especially in a digital age where smart devices supply us with answers to our many questions… ‘hey Alexa/Google… what’s the…?’!
All of this has certainly spurred me on in my work in innovation, where we put the user/candidate at the centre of their learning and therefore their assessments. We have talked about personalisation in education for many years; I see digital assessment as a viable route to educational personalisation: carefully implemented digital innovation offers the opportunity to improve exam accessibility, ensuring all users have their needs met. This might be through screen contrast options, user interface and font scaling, voice-activated software, and other inclusive design techniques (a small sketch of one such technique follows this answer).
Moreover, digital assessments can be supported by remote invigilation, or ‘proctoring’, which means candidates may take the test when they choose and feel ready, rather than sitting all linear assessments in one fell swoop. Candidates can instead sit in a place where they are comfortable and alone. We know, however, that there are technical issues with remote invigilation; many companies and start-ups are continually working to improve this space. I imagine that in a few more years the tech will be even better and further-reaching.
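As a concrete illustration of the inclusive design techniques mentioned above, here is a short, hypothetical TypeScript sketch showing how a candidate’s preferences for contrast and font scaling might be applied to an on-screen test. The interface, class name and function are invented for illustration and do not describe any particular product.

```typescript
// Hypothetical sketch: apply a candidate's accessibility
// preferences to an on-screen assessment in the browser.

interface AccessibilityPrefs {
  highContrast: boolean; // e.g. chosen on a settings screen
  fontScale: number;     // 1.0 = default size, 1.5 = 150%, etc.
}

function applyPrefs(prefs: AccessibilityPrefs): void {
  const root = document.documentElement;
  // Scaling the root font size scales all rem-based typography.
  root.style.fontSize = `${prefs.fontScale * 100}%`;
  // Toggle a high-contrast theme defined in the stylesheet.
  root.classList.toggle("high-contrast", prefs.highContrast);
}

// Example: a candidate who prefers larger text and high contrast.
applyPrefs({ highContrast: true, fontScale: 1.25 });
```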
In your ambassador profile you state: "I believe we must always see assessment and social justice as interrelated." Could you tell us a bit more about this?
JISC studies in the Higher Education (HE) and Further Education (FE) domain during the pandemic showed that many students in Higher Education from diverse and underrepresented groups performed better during online lectures/assessments because they were in a safe space and not being singled out in classes where they already felt ‘different’. The return to classrooms saw these students’ performance decline again. Obviously, this point only covers elective learners; lower down in the K-12 system, we see families without access to Wi-Fi or laptops, which clearly hinders learning.
At RM we actually produced a robot called ‘Kitt’, which was aimed at reducing the impact of the digital divide for families without access to tech. Kitt would be taken home by primary school learners, who could read to the robot, record themselves, get homework reminders from it, and even record videos and surf the internet. It’s still in prototyping, but we were entered for a BETT award!
As we know, many institutions are now looking at hybrid learning models to underpin post-pandemic courses, but what does this mean for assessment, and for those without access to hardware?
Well, many FE and HE centres no longer have unseen assessments; they are opting for ‘open-book’ models, which are aimed at testing skills and application, not memory. They are also relying more on ‘trust’ than on ‘exam security’. Such innovations are yet to be brought into the high-stakes assessment arena in the UK in any formal capacity. And we’re beginning to ask important questions about equivalency: will it do to move a paper-based assessment on-screen and leave it at that? Or do we need to innovate to make digital assessments match twenty-first-century skills requirements? Are skills like coding, using software, cyber security, or fraud detection things that need to feature in the syllabuses of the future?
The digital divide is incredibly real. I believe investment by tech giants, the Government, and other industries or institutions could help improve access to digital technology for many social groups. Until this happens, there’s not much point in high-stakes assessments becoming fully digital.
What do you enjoy in your spare time?
I publish my poetry and photography on a website I run for fun, www.freative.org; I find it helps me cement more technological skills, especially when designing content and style that appeals to users.
Living in London allows me to visit galleries, the theatre, and talks about literature and social issues, which inspire ideas for my creative work and fulfil my need for ‘culture’! Visiting Freud’s house recently evoked a new poem: something about the smell of dust in his study! I’m still working on it though!
Wherever possible I eat delicious food, enjoy wine, see films, shop for bargain vintage clothes, travel to new places and spend time discovering new things.
And I still find that my work in assessment, teaching and now in Edtech comes into so many of the everyday conversations I find myself in. It’s one of the joys of working in the education sector: not only are your colleagues so great and like-minded, but you’re always working on living topics - it never gets dull!"
Would you like to feature in a future Member spotlight for Perspectives in Assessment? We'd love to hear from you. Contact: thenetwork@cambridgeassessment.org.uk