CONTRIBUTORS


AI can’t possibly do my job (can it?) – I’m an Early Years Teacher!


This month, in our ongoing collaboration with Edge Hill University curated by ALICIA BLANCO-BAYO, Early Years Lecturer and WTEY Programme Leader at the University’s Faculty of Education, we’re delighted to hear from MARTYN STARKEY, Year 2 Lead for the BA (Hons) Primary Early Years. Martyn has a wealth of experience in computing and music, using technology to record, publish and perform music with children on TV and local radio.


You can’t have failed to notice how much media attention Artificial Intelligence (AI) has attracted in recent months, given how quickly the field is moving. In fact, the pace is already leaving many wondering whether it is a threat to jobs and the way we will work in future, or a force for good that will help us do things better and find solutions to problems that are critical, time-sensitive, or unaffordable in time and money in the current global economy.


Long gone are the days when one could say that computers were only as good as the information or programming put into them, because, yes, computers can now write and generate their own code, and fix it if it proves faulty.


For Early Years practitioners, our jobs are safe from this – aren’t they? We know each child individually; we know their background, desires, preferences and character, which helps us predict their behaviour, choices and engagement, and assess them along the way.


But AI is moving at such a pace that GPT-4* can already analyse images, photos and artwork, and soon, elements of video. This will supposedly generate assessments, reports, and bespoke individual education plans (IEPs) for children, taking account of existing interests and past experiences. AI will be able to identify body language, facial expressions and language in a way that is already being used by security services to combat terrorism. Watch the film Minority Report (2002) to see where this could lead. It doesn’t end well when predictive and pre-emptive intervention creates a ‘society free of all crime’.


However, as Early Years practitioners, we must reflect on and question this, because the premise of the developing child is, arguably, trying out lots of new and different things to begin to see what they like and are interested in. If an IEP works only on observation of photos, videos, or anything else, where and how does it generate artificial predictions as to what a child will enjoy and/or be good at in future?


Indeed, if AI does this, what does that teach a child about perseverance and resilience when there is a need to engage with something they would rather not engage with by choice? Can AI teach tolerance and respect of situations and people if it is only providing them with educational programmes they will enjoy and engage with? Surely, we must use our professional judgement to add these elements accordingly.


Will AI be able to adequately cater for and respond to children’s emotional needs, and help them develop in a way that is bespoke to them? It may appear to do this on the face of things, but AI is already under scrutiny for the imperfections in its answers to a variety of questions asked of it. In one example, a US lawyer used AI to generate and cite case law for a matter they were working on; the citations proved to be fictional, and it was only the mercy of a particular judge that prevented the lawyer from being struck off for a lack of due diligence.


Imagine that the teaching provision we created for one of our children was based on AI predictions drawn from previous choices, games played and interactions with others, and was found to have moved that child down a particular path they might never have taken under normal circumstances. That child then decides later in life that their interests were not served, and takes you or me, and their Local Education Authority, to court for compensation for missed opportunities and discriminatory provision!


Children change. We all do. The whole premise of the developing child and their rights is to explore, play, discover, find out, change their minds, and learn over a long period of time what their preferences may be. If we allow AI to overly influence what we provide for our children based on their choices, it could advise that we don’t give them cabbage for tea, or, based on previous choices, that we let them watch TV or an iPad all day, because the algorithm works on previous information. Think about when you go online shopping and are bombarded with AI-generated suggestions of what to buy, because of what you bought before or what people like you have bought since!


26 www.education-today.co.uk


We, as practitioners, should care about what we say and do. We and our young children might show an interest in something and, if we are not careful, that something could be overly promoted to the detriment of other activities and interests. This is why we need real people to interact with our children, because the rights of our children, I would argue, must be taken seriously. Nor should children’s educational provision be based only on their choices or interests, because resilience comes from challenge and from trying new things as a journey of discovery – one we can and should be travelling together.


We need to consider the rights of the child, in terms of their safeguarding, in what we allow AI to influence. Articles 3 and 12 remind us that children’s opinions are to be taken seriously in decisions that affect them – not an algorithm determining what is ‘best’ for them. Children’s emotional development is also catered for under Article 27, but if AI reduces the need for Early Years practitioners to interact with children, might this not affect their mental well-being, as well as their moral and spiritual development? Or will these become almost non-existent, as they were in the novel Brave New World?


A ‘brave new world’ is already here, but how we control it and make it work for us will be key to the next generation and, ultimately, to the society we create. There will certainly be some disparity between those who want more automation and those who see the governance of such issues as a paramount concern. It appears that we cannot control the Internet as it stands, and this has cost some children and their families dearly because they could not be protected! But the premise of this article was whether AI could possibly do your job as an Early Years teacher, and yes, it probably could – eventually! Whether we should allow it to do aspects of your job, however, is a debate that will continue for some time.


* Generative Pre-trained Transformer 4 (GPT-4, March 2023) is a multimodal large language model created by OpenAI, and the fourth in its numbered “GPT-n” series.


July/August 2023

