EMBEDDED VISION


Edge imaging driven by AI, say Embedded World experts


Greg Blackman reports from the Embedded World show, where a panel gave insights into vision processing at the edge


Discussions around embedded vision tend to turn to AI at some point, and the panel session during the Embedded World digital show was no different. Customers often want to add AI functionality to a vision system, explained Arndt Bake, chief marketing officer at Basler, during the discussion. However, AI doesn't run well on a PC, so customers are looking for hardware where AI inferences do run well, he said – namely embedded architectures.

Bake was joined on the panel by Olaf Munkelt, managing director of MVTec Software; Fredrik Nilsson, head of the machine vision business unit at Sick; and Austin Ashe, head of strategic partners and channels, IoT devices at Amazon Web Services (AWS). The session was organised by VDMA Machine Vision along with Messe Nürnberg.

Bake added that 'customers are struggling somewhat with the stretch of changing from the classic architecture to the new [embedded] architecture. We need to enable customers to take small steps... to migrate from one technology base to the other.'

Basler is partnering with AWS to provide vision solutions using the latter's Lookout for Vision service. It allows customers to train machine learning models in the cloud that can then be deployed on edge devices, such as camera boards from Basler. Lookout for Vision is designed to make it easier to use machine learning, targeting manufacturing and the industrial internet of things.
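The train-in-the-cloud, deploy-to-the-edge pattern described above can be sketched in a few lines of Python. This is an illustrative toy only: the `CloudTrainer` and `EdgeCamera` classes are hypothetical stand-ins, not part of any AWS or Basler API, and the "model" here is a simple brightness threshold rather than a neural network.

```python
# Illustrative sketch of cloud training plus over-the-air edge deployment.
# CloudTrainer and EdgeCamera are hypothetical names, not real AWS/Basler APIs;
# the "model" is a toy brightness threshold standing in for a neural network.
import json
import statistics


class CloudTrainer:
    """Stands in for cloud-side training (e.g. a managed ML service)."""

    def train(self, normal_images):
        # "Train" by learning the mean brightness of known-good images;
        # anything far from this mean will later be flagged as an anomaly.
        mean = statistics.mean(sum(img) / len(img) for img in normal_images)
        return json.dumps({"mean_brightness": mean, "tolerance": 20.0})


class EdgeCamera:
    """Stands in for an edge device that receives a model over the air."""

    def __init__(self):
        self.model = None

    def deploy(self, model_artifact):
        # Over-the-air update: the serialized model replaces the old one.
        self.model = json.loads(model_artifact)

    def inspect(self, image):
        # Inference runs locally on the device, so only the verdict
        # (not the raw image) ever needs to leave the edge.
        brightness = sum(image) / len(image)
        ok = abs(brightness - self.model["mean_brightness"]) <= self.model["tolerance"]
        return "ok" if ok else "anomaly"


# Train once in the "cloud", then push the same artifact to many cameras.
artifact = CloudTrainer().train(normal_images=[[100, 110, 90], [105, 95, 100]])
camera = EdgeCamera()
camera.deploy(artifact)
print(camera.inspect([102, 98, 101]))   # close to the training data
print(camera.inspect([250, 255, 240]))  # far brighter than anything seen in training
```

The point of the pattern, as the panel notes, is that the expensive step (training) happens once centrally, while the latency-sensitive step (inference) stays on the device.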


Ashe commented during the discussion that AWS is trying to lower the barrier to entry for customers wanting to use embedded vision for the first time, or those wanting to expand and scale it. He said that 75 per cent of businesses plan to move from pilot to full operational implementations of embedded systems over the next two to five years. 'We are positioning ourselves to orchestrate the edge and the cloud in a unique way,' he said, adding that edge processing is important when it comes to latency, bandwidth, cost of sending data, and security and privacy. The cloud comes into play for things like monitoring devices, and also updating them.

'When you think about managing embedded vision systems at scale, there is an elegance that comes with being able to take a [neural network] model, train it in the cloud, and deploy it over the air to all of the machines that need it,' Ashe said. He added that the adoption of 5G 'creates a whole new opportunity for cloud and edge to have closer interoperability and more edge-to-cloud use cases to be delivered.'

In terms of using AI, Munkelt and Nilsson agreed that AI adds value and opens possibilities for using vision, but that it has to be easier to use. 'We have to enable customers of embedded vision to quickly get

22 IMAGING AND MACHINE VISION EUROPE APRIL/MAY 2021


@imveurope | www.imveurope.com


(Image credit: Shutterstock.com)

