
How educators can outsmart AI in the assessment stakes

Deploying some innovative strategies can make it harder for students to pass off polished prose as their own work.

Oh, what a difference a year can make. Back in November 2022, when the generative AI engine ChatGPT was unleashed on the world, some teachers took the wildly optimistic view that they’d be able to bar students from using it to help them with – read complete – their assignments and assessment tasks.


Suffice to say, that’s not how things have worked out for institutions and educators, here in Australia and around the world.

Instead, the education community is learning to live with this transformative technology – and using it to reduce workloads and enhance teaching processes and practices.

November 2023 saw the launch of a national framework guiding the use of AI in Australian schools, which was endorsed unanimously by federal and state education ministers.

Designed to support all stakeholders connected with education, including school leaders, teachers, parents and students themselves, the framework seeks to guide the responsible and ethical use of generative AI tools in ways that benefit students, schools and society.

Exploring the upside

There are indisputably benefits to be had. Generative tools such as ChatGPT can be utilised to adapt learning paths, provide personalised content and give students instant feedback.

Teachers can harness their power to help them design curricula, develop assessments and the rubrics to grade them, and create intervention strategies to support the individual and additional needs of their students.

Topic tests, questions and answers pertaining to specific blocks of text, outlines and summaries of core material – tick, tick, tick!

And the downside

On the flip side, generative AI tools can pose an existential threat to integrity and quality if educators and institutions don’t prescribe clear parameters for their use, and police those parameters diligently.

In the absence of these ‘guardrails’, the technology can make it all too easy for students to plagiarise, by submitting auto-generated text for assignments and learning tasks in place of their own work.

And then there are the social and ethical issues that can arise when the collection and synthesis of data is outsourced to an anonymous, artificial ‘brain’ – namely bias, data privacy and misinformation.


Given the danger these phenomena can pose to social cohesion and stability, and to democracy itself, the onus must surely be on educators to inoculate the next generation of Australians, by schooling them in the ethical and productive use of AI.

Assessing with integrity

That means ensuring students develop the higher-order skills – critical thinking, metacognition, creativity and problem solving – that AI has not yet succeeded in replicating.

Without them, they’ll struggle to critically analyse and evaluate the information they encounter or, rather, are bombarded with, when they access the digital platforms and search engines most of us turn to automatically when we want to know something.

Assessment processes which gauge students’ true knowledge and skills – and not merely their ability to interrogate a large language model – are vital.

Here are some approaches we’ve seen work well for our clients across the local education sector.

Individualised questions

In days of yore, it was common practice to require all students to answer the same assignment question, but today that tack can make it all too easy for classmates to collude. Mixing it up by setting individualised questions, and requiring students to research and present on their own problems and solutions, makes it harder for them to rely on AI-generated text.

Case studies and scenarios

If you’re seeking generic content, AI is your friend. That’s why it pays to go super-specific, setting case studies and theoretical scenarios for students to analyse and address. Business students, for example, might be asked to propose a strategy or recommendation for an enterprise facing a unique or very specific challenge.

Oral presentations and recordings

While AI may help students present a more polished paper than they’d otherwise be capable of creating, it’s of considerably less use when they’re put on the spot and forced to explain concepts and points in their own words to a critical audience. That’s why requiring final submissions to come in the form of an oral presentation or video recording can help both educators and students achieve a gold star in education excellence.

Analysis of AI-generated texts

Immerse yourself in AI-generated content and you’ll fairly quickly start to see the limitations of the technology. Requiring students to do likewise, and to critically analyse the accuracy of what they’re reading, will give you a good gauge of their knowledge and understanding of the topic.

Getting smarter about AI

Like it or not, generative AI is here for good, and for all. It will continue to shape teaching and learning processes, and it’s on Australia’s army of educators to ensure it does so in a positive way. Getting smarter and more innovative about the way students’ knowledge and skills are tested is a great start.

This article was written by Brett Auton, K-12 Industry Lead for Atturra.
