ARTIFICIAL INTELLIGENCE


Translating user experience into AI edge imaging


An iterative approach is needed to build a real-world AI vision application on embedded hardware, say Irida Labs’ Vassilis Tsagaris and Dimitris Kastaniotis


Deep learning models have been proven to achieve human levels of performance in many tasks and are nowadays the backbone of computer vision products. These models are very good at solving well-defined tasks with enough data, but in real-world applications it's very unlikely that a task will have been defined from scratch. Additionally, the computational capacity is bounded by hardware cost and power consumption, critical factors for mass deployment of products. Finally, decisions made by deep learning models are hard to interpret, and there is no mechanism to translate user experience into rules that would improve the model's decisions according to the customer's needs.

Computer vision is a powerful tool for products and services for daily life. Most of these applications are deployed on edge devices under the Internet of Things umbrella. However, IoT concepts are mostly designed to support the deployment of fixed functionalities like sensors, but not computer vision algorithms, which by default are entities that need to evolve over their lifetime. In order to close the gap between writing computer vision algorithms and deploying them on IoT devices, Irida Labs has developed an approach where user experience helps shape the end-product, including model performance and hardware selection.


User experience

A key concept of the Irida Labs approach is to rely on an agile development process that allows the customer to modify or adapt the definition of the task. Indeed, in most computer vision products, the customer tries to summarise the task in very abstract ways, like 'people detection'. This description is usually followed by a small test set, which is not representative of the underlying data distribution of a real-world deployment. This is a weak definition of the task, as it does not consider factors like the viewing angle, lighting, whether it's deployed indoors or outside, and weather conditions.

'Working with highly pruned machine learning models that operate with low-bit accuracy is the selection of choice'
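The pull quote's reference to highly pruned, low-bit models can be made concrete. Below is a minimal sketch, not Irida Labs' actual pipeline: it applies magnitude pruning and 8-bit dynamic quantisation to a stand-in classifier head using standard PyTorch utilities (the layer sizes and labels are illustrative assumptions).

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical classifier head standing in for a vision model's final layers.
model = nn.Sequential(
    nn.Linear(512, 128),
    nn.ReLU(),
    nn.Linear(128, 2),   # e.g. person / no-person
)

# Zero out the 50% smallest-magnitude weights in each Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")   # bake the pruning mask in

# Store and execute the remaining weights in int8 for cheaper inference
# on cost- and power-constrained edge hardware.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

print(quantized(torch.randn(1, 512)))   # runs with int8 weight kernels

In practice, unstructured pruning only translates into real speed-ups with sparse-aware kernels or a structured variant, but the sketch shows the two levers the quote refers to.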


To overcome this problem, the deployment of the device is connected to model development and, in particular, to data management and user experience. The main goal of this approach is to translate user experience into a representative test set. This is implemented by a software mechanism for receiving user feedback. The feedback comes from the real-world deployment of the first version of the model and allows the model to learn which samples are important. In this manner, Irida Labs provides a deep learning model that's a good starting point, and then, together with the customer, iterates on feedback to improve the model.
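One way such a feedback mechanism could be structured is sketched below; the class and field names are illustrative assumptions, not Irida Labs' API. Predictions that users flag as wrong are logged on the device and later curated into the representative test set for the next release.

import json
from dataclasses import dataclass, asdict

@dataclass
class FeedbackRecord:
    frame_id: str      # reference to the captured image on the device
    prediction: str    # what the deployed model reported, e.g. 'person'
    user_label: str    # what the user says it should have been
    confidence: float  # model confidence at inference time

class FeedbackQueue:
    # Append-only log of user corrections from a deployed device; a sync
    # job would later upload it for curation into the test set.
    def __init__(self, path):
        self.path = path

    def submit(self, record):
        with open(self.path, "a") as f:
            f.write(json.dumps(asdict(record)) + "\n")

# A user corrects a false positive from the first model release:
queue = FeedbackQueue("feedback.jsonl")
queue.submit(FeedbackRecord("cam0/000123.jpg", "person", "no_person", 0.62))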


Data management and model development

The first version of the computer vision model uses data that matches the description of the task and the first test set of data. Subsequent releases, however, will require more application-specific data – the true challenge is in collecting the appropriate data.


The process of gathering data involves a number of mechanisms that can affect the model's performance. For example, data sampling might introduce biases that are hard to detect. To overcome these pitfalls, Irida Labs uses a development process that incorporates some kind of intelligence and minimises
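As one example of the kind of intelligence such a process can apply, condition-stratified sampling counters the collection bias mentioned above: if most captured frames are daytime frames, naive random sampling reproduces that skew, whereas sampling per condition bucket keeps the training pool balanced. A minimal sketch, with illustrative bucket names:

import random
from collections import defaultdict

def stratified_sample(frames, per_bucket):
    # frames: list of (frame_id, condition) pairs, e.g. ('f4', 'night')
    buckets = defaultdict(list)
    for frame_id, condition in frames:
        buckets[condition].append(frame_id)
    sample = []
    for ids in buckets.values():
        sample.extend(random.sample(ids, min(per_bucket, len(ids))))
    return sample

frames = [("f1", "day"), ("f2", "day"), ("f3", "day"), ("f4", "night")]
print(stratified_sample(frames, per_bucket=1))  # one frame per condition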

