WHAT THE EXPERTS SAY....


AI IS AN ENABLER IN LEARNING ASSESSMENT – NOT A HINDRANCE


Comment by GRAINNE WATSON, Chief Operating Officer, RM


There's no denying that the rapid advancement of AI technologies has opened up new possibilities and aspirations for the education sector. As AI becomes more prevalent, questions arise about whether it is just a buzzword, or whether it can kick-start change and usher in a new era of learning and assessment.


Currently, hopes for AI are huge: it could improve learner outcomes, increase productivity, drive cost efficiencies, support teachers and attract new learners. However, before the education sector rolls out AI far and wide, we wanted to establish a concrete use case showing that AI can augment human activity and enable productivity gains, through applications such as helping teachers grade work and helping students understand feedback. That means AI must be thoroughly tested.


At RM, we set out to do just that. We know there will be a range of concerns among teachers, students, parents, schools and others about what AI can do and how it can enable better outcomes for learners, educators and accreditors. Our aim is to use AI to support and liberate teachers and educators, not replace them. To properly evaluate the technology, we focused on assessing AI's marking effectiveness and its consistency when marking at scale, as ultimately we believe it can and will be used in summative assessment.


We conducted two Proof of Concept (PoC) projects: the first on the assessment of proficiency in English as a foreign language, and the second on summative assessment of English language skills in schools.


Our report resulted in three key findings:

1. AI marking is as effective as human marking


One of the big debates in the education sector about AI is whether it can really reduce arduous and admin-heavy tasks such as marking. While it has already been shown that AI marking is close to or on par with human marking, our study was interested in evaluating its marking capabilities for subjective or long-form answers. By analysing human variance versus AI variance in the marking of each question, we found that AI doesn't just mark at a level equivalent to human markers, but that it improves feedback quality – even when marking an essay. Teachers face huge time strains when it comes to assessing homework and assessments, so this result paves the way for a major transformation of their workload.

2. AI marking is more consistent than human marking

The effectiveness of AI is just one hurdle that needs to be overcome.
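The kind of per-question variance comparison described above can be illustrated with a minimal sketch. The mark values, scale and question labels below are purely illustrative assumptions, not data from the RM study:

```python
# Illustrative sketch: comparing per-question mark variance for human vs AI
# markers. All mark values are invented for demonstration (marks out of 10).
from statistics import variance

# Each list holds marks awarded by independent markers for the same response.
human_marks = {
    "q1": [6, 7, 5, 8],   # four human markers disagree noticeably
    "q2": [4, 6, 3, 5],
}
ai_marks = {
    "q1": [7, 7, 6, 7],   # repeated AI marking runs cluster tightly
    "q2": [5, 5, 4, 5],
}

# Lower variance for a question indicates more consistent marking.
for q in human_marks:
    h_var = variance(human_marks[q])
    a_var = variance(ai_marks[q])
    print(f"{q}: human variance={h_var:.2f}, AI variance={a_var:.2f}")
```

In this toy comparison, the AI markers' variance is lower on every question, which is the pattern the study describes when it reports that AI marking is more consistent than human marking.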


Assessments, exams and homework need to be marked consistently so that every student receives fair and actionable feedback.

Our PoC projects set out to test whether AI can identify the grade of a response, and mark responses within that grade, with more consistency than a human marker. The results demonstrate that AI marking is more consistent than human marking.

One of the biggest factors behind inconsistency in human marking was fatigue. With a growing workload and many other tasks to complete, such as lesson planning and extracurricular activities, teachers are often working longer hours.

Now that we know AI can grade assessments as consistently as human markers, we're one step closer to removing barriers to AI adoption in education – unlocking greater scalability, efficiency and reliability in the assessment process.

3. AI marking enhances teacher support

If AI marking is effective and consistent, it is possible to reduce teacher workload and improve productivity in the classroom and in other assessment contexts.

The final piece of the puzzle is that AI can also mark much more quickly than humans, so more assessments can be marked in a set period. Papers that took an experienced human marker 40 minutes to grade were completed in five seconds by multiple AI markers. For us, this allows a level of consensus to be created: a paper can be marked multiple times before it reaches a human, so human focus can shift to the more difficult papers where human knowledge and skill are essential.

With any new technology, it is essential to have a clear view of the opportunities and benefits it can bring, alongside the safeguards that will need to be put in place before we encourage mass adoption. Currently, the most immediate benefit of implementing AI in the education sector is in the learning setting – the formative assessment environment – where it can help to reduce teacher workloads and familiarise students with exam conditions and marking before they take summative assessments. It can also give learners immediate feedback, helping them identify areas for improvement faster.

At RM, we're eager to work with our customers to tailor AI solutions to their assessment and qualification needs in ways that will improve learner outcomes and drive continuous improvement without introducing risk. That is why I am genuinely excited by the results of our Proof of Concept projects, which show there is a clear rationale for the education sector to responsibly harness the advantages AI can bring. To learn more about our PoC projects and how you can grasp innovation through AI in the education sector, read our full report here.

April 2025


www.education-today.co.uk 27
