Member spotlight: Adapting English assessments in Australia - a conversation with Cara Dinneen

Interview with Cara Dinneen, Education Manager at Macquarie University College in Sydney, Australia

Cara Dinneen is the Education Manager for English Language Programs and the English Medium of Instruction Centre at Macquarie University College in Sydney, Australia. We spoke with her about adapting language assessments to reflect the interactive and tool-supported nature of modern academic environments, the opportunities and challenges afforded by Generative AI, and the importance of developing a community in assessment professional development.

Why is knowing about assessment important in your role? 

"I work primarily in the pathway space, where we help students prepare for university studies. Much of our language and assessment work focuses on English for academic purposes. In my role, I oversee all learning and assessment across our curricula, making it crucial for me to stay ahead of any changes in language assessment. While change in the learning and teaching space can feel rapid, assessment often takes the longest to evolve, as it is steeped in tradition.

Even before the advent of Gen AI, the 'target language use domain' (the context or situations where test takers will use the language after completing the test) in university learning and teaching was becoming increasingly interactive. This means that if we want to assess whether students have the language skills they will need for further study, we must start adapting our assessments and rethinking how we conduct them.

There has been a noticeable shift away from the traditional long-form essay and research essay in academic writing. These genres of academic writing still exist, but students now have many tools to support their structure. We need to return to basics in some aspects of language use. We want students to be confident in their language abilities, so that when they use the available tools at university, they have enough language knowledge and awareness to use them effectively.

I think that as professionals, the more we learn about assessment and connect with others practising language assessment and other types of assessment, the better it will be for everyone. And knowing more about assessment helps us better align with student learning outcomes. We all aim to achieve the best possible outcomes for our students, and increasing our knowledge of assessment helps us do that more effectively."

Has membership helped you with this? 

"Being part of a network like The Assessment Network at Cambridge is crucial. It helps you stay on top of changes and connects you with others who manage assessment challenges. Connecting to an assessment community is key to keeping up with innovations in language assessment. We can't do it alone, but by networking and understanding the issues and innovations within our own centres, we stay ahead.

I am also the national convener for the English Australia Assessment Special Interest Group. We hold webinars and events to discuss issues and resolutions that arise in assessment. Access to The Assessment Network has been an invaluable resource for meeting the needs of people within the special interest group."

Were there any courses you’ve found particularly useful? 

"I recently completed A103, an introduction to data literacy, to pick up tips and tricks for the workspace. I felt I needed to become more fluent with statistics and data interpretation because quality assurance requires us to be highly analytical about student results. Our decisions must be data-driven.

The data literacy course helped me become more confident in unpacking information within graphs and understanding the statistical lingo. It was probably one of the best courses I've taken in terms of immediate takeaways. For example, I learned that when dealing with large amounts of data in a spreadsheet, filtering to get the desired data is essential. Previously, I ran pivot tables directly from the streaming data, which could be prone to errors and data corruption. I discovered that copying and pasting the streamed data onto a new tab before running my analysis was a simple yet effective solution.

Really simple stuff, but something that I wasn't aware of. So that's been a really quick takeaway for me. But really now I’d like to go back and spend some time going over the course content to think about how I can apply it further."
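The spreadsheet tip described above — take a static copy of live data before filtering and pivoting it — can be sketched in Python with pandas. This is an illustrative sketch, not part of the interview: the column names and values are hypothetical, and pandas stands in for the spreadsheet tool mentioned.

```python
import pandas as pd

# Hypothetical student results as they might arrive from a live feed.
live_results = pd.DataFrame({
    "cohort": ["A", "A", "B", "B"],
    "skill": ["writing", "speaking", "writing", "speaking"],
    "score": [62, 70, 58, 66],
})

# Step 1: take a static snapshot first (the "copy to a new tab" step),
# so the analysis can't be affected by the source refreshing mid-calculation.
snapshot = live_results.copy(deep=True)

# Step 2: run the pivot on the snapshot, not on the live feed.
pivot = snapshot.pivot_table(index="cohort", columns="skill",
                             values="score", aggfunc="mean")
print(pivot)
```

The key design choice is the same as in the interview: analysis runs against a frozen copy, keeping the aggregation reproducible even if the source data keeps streaming in.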

What do you enjoy most about your work?

"I love the scope for creativity and collaboration that I have in my role. I do a lot of program design planning for English Medium Instruction teacher training, I prepare presentations and seminars for multiple teaching groups, I manage professional development for teams of teachers and, of course, I am instrumental in assessment design for our EAP (English for Academic Purposes) programs. I work with different teams of people to accomplish all of this, and it makes work life very dynamic."

What do you see as opportunities in the assessment space now and in the future?

"I think that in the EAP space, there is opportunity to rethink the way we assess language and what we prioritise as important. Traditionally, we have focused a great deal on assessing written output. Today, students have the tools to support their written output when they get to university and when they enter the workplace. That's not going to change, so how valuable is the personal experience essay on the 'pros and cons of studying abroad' in gauging whether a student is ready for further study? I would say, not very valuable at all. It's more beneficial for students to focus on integrated tasks that require them to draw on written and spoken texts to respond to essay prompts, as they will in higher education. And it's beneficial to teach students how to use Generative AI to provide feedback on their written texts. The challenge lies in helping students to understand why getting the tool to write the text for them is not useful, while using the tool to provide feedback is very useful.

Another area of contemplation for me lies in the advent of summary and comprehension tools like NotebookLM. They're quite revolutionary for all of us, students and professionals, working in the space where we'd like to assess the content of academic articles before we decide whether to read them deeply. Tools like this take us beyond reading the abstract and scanning a few paragraphs to decide whether an article is worth investing more time in. They provide a compact summary in written or podcast form that allows for a solid understanding of the article. There are also comprehension questions to help ensure you've understood, and the tool can synthesize multiple articles. I haven't investigated them enough yet to decide what sorts of nuance may be missed, but I wonder about how, or whether, the advent of these tools should shape our teaching and assessing of reading. It's clear that critical reading skills are more important than ever, because students must be critical consumers of the outputs produced by GenAI tools. So, I am considering designing some proof-of-concept tasks around comparing Gen AI summaries with original articles, analysing for bias or assumptions, etc. Ultimately, it's not the reading skills we are changing, just the task design that draws on these fundamental skills. And we shouldn't be afraid to teach the students to use these tools. We are not teaching them to cheat, we are awakening them to the 'pros and cons of Gen AI tools in learning' – now there's a new personal experience essay for the international proficiency tests!

One final area I'll comment on is the increasing prevalence of the viva voce in higher education, in Australia, at any rate. Students are still being asked to write their essays and reports, but some academics are not marking the written output in the way that they used to. Instead, the grading is based on the student's demonstrated understanding of the content of the essay, which is assessed through a one-on-one interview. This is particularly challenging for EAL students and is an area we need to help them develop skills in. Micro assessment tasks that ask students to explain written work and defend opinions are useful."

This discussion first appeared in Perspectives on Assessment, The Assessment Network member newsletter, which features key voices from the assessment community along with other member-exclusive content. Would you like to feature in a future Member Spotlight? We'd love to hear from you - get in touch.

Assessment practitioner awards

Our awards have been designed to recognise and demonstrate a commitment to developing your assessment expertise.

Build and develop your practice

Focusing on an important element of assessment design and practice, our assessment practitioner workshops are designed to fit seamlessly around your work commitments.