TECHNOLOGY & DATA


Making the most of AI without accident


The potential for artificial intelligence (AI) to give cleaning businesses a competitive edge is very real, but there are risks too, especially for personal data. Luke Dixon, Head of Data and Information at Freeths, outlines the opportunities and how to avoid the pitfalls when using AI in your business.


AI can improve the performance of robotic cleaning machines, making them more autonomous and efficient. It also has great potential in sales and marketing: for example, crunching sales data to focus outreach on the most promising customers. AI can guide social media marketing campaigns to target the most fruitful leads, then review the outcomes to refine the process further.


AI can improve customer relationships by offering 24/7 support through chatbots. All of these applications involve collecting and processing data that you have to handle carefully – there have been concerns about the data that robotic cleaning machines capture and how secure it is.


Protecting data when using AI:


1. Involve relevant stakeholders


Before you roll out an AI opportunity, you need to get a number of people on board, including legal counsel and compliance, so you can take advantage of the benefits while putting appropriate protections in place.


2. Know your responsibilities


AI systems are likely to involve lots of personal data and will be regulated by the UK General Data Protection Regulation (UK GDPR). Whether you're a controller or a processor of that data will determine your responsibilities under UK GDPR.


3. Be clear about lawful basis


Controllers need to know their lawful basis for processing data. UK GDPR sets out six options, including performance of a contract, consent and legitimate interests. Knowing where you sit within the AI process matters – the selection of the lawful basis depends on it.


4. Minimise data processed


GDPR requires you to process only the data you need. When you're feeding in training data sets, think about how much data you really need; the sketch below shows the idea. The more data you hold, the more is exposed if there's a data breach.
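To make data minimisation concrete, here is a minimal, illustrative sketch in Python (not taken from the article) of trimming a customer data set down to only the fields a forecasting model actually needs before it is used for training. The file and column names are hypothetical examples.

# Minimal, illustrative sketch of data minimisation before training.
# The file and column names below are hypothetical examples.
import pandas as pd

# A full customer export may contain far more personal data than the model needs.
raw = pd.read_csv("customer_orders.csv")

# Keep only the fields the forecast actually uses; leave out names, emails and addresses.
needed_columns = ["postcode_area", "contract_size", "visit_frequency", "order_value"]
training_data = raw[needed_columns]

# Save the minimised set; only this file is passed on for training.
training_data.to_csv("training_minimised.csv", index=False)

Starting from a trimmed-down file like this means that if the training data is ever breached, far less personal information is exposed.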


5. Consider a Data Protection Impact Assessment (DPIA)


A DPIA brings stakeholders together to identify the risks in processing data and to work out ways of mitigating that risk. A DPIA helps to get the benefits out of a system without incurring too much risk.


6. Tell your customers


Where you’re going to be using AI systems, you need to provide your customers (the data subjects) with privacy notice information. This tells them how you’ll be using their data, including whether you’ll be sharing it or processing it for a given purpose, and that you’ll be doing so on a lawful basis.


7. Think about data rights


GDPR gives people rights of access and rectification over their personal data. You’ll need to work out how you can respond to data subject access requests and requests for correction, and what impact an erasure request might have on your system.


8. Keep systems secure


For each AI system you have, think about the data you put into it, the level of risk if that data is no longer secure, and how secure you can make it. An AI system might be higher risk than other software because there might be more third-party code in the system, longer software supply chains and new types of attack.


9. Be clear about chatbots


Chatbots can make your company more available to customers, but you must make it clear, using appropriate legal disclaimers, when customers are talking to one. Think about where the liability lies if a chatbot says something it shouldn’t.


10. Understand the limitations


AI systems are only as good as the training data they’re based on. If that data is biased or inaccurate, the outputs might be biased, unfair or inaccurate.


For advice on your responsibilities in terms of data and information when using AI, contact Luke Dixon at Freeths.


www.freeths.co.uk



