BIO-INSPIRED IMAGING


development and uptake of bio-inspired imaging, with Bamford explaining that a big push for Inivation has been the recent proposal of several good simultaneous localisation and mapping (SLAM) algorithms that operate on the event-based output of dynamic vision sensors. Despite these enabling developments, certain


hindrances still prevent the full uptake of bio-inspired technologies, such as their large differences from standard vision, which make them difficult to integrate into current machine vision environments. 'The hardware and system-on-chip integration


of bio-inspired sensing is still maturing, and this creates a barrier to entry in comparison to conventional active pixel imagers,' confirmed Dr Yiannis Andreopoulos, one of the researchers leading efforts at University College London (UCL) as part of a collaborative project that began earlier this year to explore the bandwidth, delay and energy saving advantages of using bio-inspired vision in Internet of Things (IoT) environments.

'In addition,' Andreopoulos continued, '[bio-inspired] spike-based sensing is still incompatible with the way gradient-based machine learning works, such as deep convolutional or recurrent neural networks, because the latter require a regular space-time sampling grid. However, spike-based hardware for machine learning has been produced – or upgraded – recently, for example the IBM TrueNorth chip and the Intel Loihi chip. These are a natural fit for bio-inspired spike-based sensing, even though they are also based on gradient-based machine learning methods.'

The hardware and system-on-chip integration of bio-inspired sensing is still maturing

Moving into the real world
The optimised data acquisition and processing of bio-inspired imaging has led it to be considered for a number of uses, such as autonomous driving, robotics and medicine. 'The autonomous driving industry is converging on lidar as a primary sensing method. However, a complementary visual solution is also needed,' said Bamford. 'DAVIS, in particular, is very promising because of its low latency, high dynamic range and limited data stream.' Additionally, the wide dynamic range of dynamic vision sensors makes them suitable for the varying lighting conditions encountered in driving, such as when entering and exiting tunnels or travelling in different levels of sunlight. Companies such as Intel, Renault and Bosch have already invested in Chronocam with automotive applications in mind.

Inivation and its clients also see huge potential for dynamic vision sensors in robotic sensing and cooperation, as they can act as complementary safety systems for robots by scanning the vicinity for people and obstacles. The low latency and consistent performance of the sensors in uncontrolled lighting conditions will also be key here, according to Bamford.

Industrial inspection and identification are also potential application areas for dynamic vision sensors, as their low power consumption enables high-speed, real-time performance, whereas standard cameras require large amounts of power and processing to achieve the same result. In the medical field, Chronocam has already


highlighted how close its sensor technology lies to natural vision: two years ago it used its sensors with Pixium Vision to help restore sight to blind people by connecting the sensors to retinal implants. 'We stimulate the brain using our sensor as an input device,' Verre explained. 'This is now commercialised around Europe for visually impaired people.' Dynamic vision sensors will find their way into


day-to-day and industrial applications in the future, according to Bamford, with chips becoming smaller and more advanced. 'We see clear pathways to making the sensors even better than they already are, improving latency, data rate bandwidth and frequency response,' he said. 'We're going to be able to realise the vision of very low power end-to-end systems.'


The pixel-level events recorded by Inivation’s DAVIS640 dynamic vision sensor while viewing eye movement
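The pixel-level events in the caption above are the basic currency of a dynamic vision sensor: instead of frames, each pixel independently reports a timestamped brightness change with a polarity. A minimal sketch of this representation in Python (the `Event` layout and `accumulate` helper are illustrative, not Inivation's actual API; real devices stream packed binary events):

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Event:
    """One pixel-level event: timestamp in microseconds, pixel
    coordinates, and polarity (+1 got brighter, -1 got darker)."""
    t: int
    x: int
    y: int
    polarity: int

def accumulate(events, width, height):
    """Sum event polarities per pixel over a time window,
    producing a signed 'event frame' for visualisation."""
    frame = np.zeros((height, width), dtype=np.int32)
    for e in events:
        frame[e.y, e.x] += e.polarity
    return frame

# Three events on a hypothetical 8x8 sensor
events = [Event(0, 3, 1, +1), Event(5, 3, 1, +1), Event(9, 4, 2, -1)]
frame = accumulate(events, width=8, height=8)
# frame[1, 3] == 2 (two ON events), frame[2, 4] == -1 (one OFF event)
```

Pixels that see no change emit nothing at all, which is where the data-rate and power savings described in the article come from.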


The internet of silicon retinas
In July the Internet of Silicon Retinas (IoSiRe) project started: a collaborative £1.4 million effort funded by the UK's Engineering and Physical Sciences Research Council (EPSRC) involving UCL, King's College London and Kingston University. The project aims to explore how bio-inspired vision could be used in conjunction with cloud-based analytics to achieve state-of-the-art results in image analysis, image super-resolution and a variety of other image and vision computing systems for IoT-oriented deployments in the next 10 to 20 years. 'Essentially, the goals of the project combine


the consortium’s unique expertise in sensing, machine learning and signal processing, and communications, in order to quantify the energy and latency savings in comparison to what is done today with conventional frame-based sensing, processing and transmission,’ explained Andreopoulos, of UCL. Within the project, the consortium will be using


Inivation’s Dynamic Vision Sensor among other low-power spike-based visual sensing hardware and combining it with low-power processing to create compact representations that can be processed using deep neural networks. ‘At the moment, converting spike-based


sensing to representations appropriate for deep convolutional or recurrent neural networks is an open problem,’ said Andreopoulos. ‘If end-to-end resources are to be considered –


sensing, on-board processing, and transmission to multiple cloud-based Docker container services – creating a framework that can provide for scalable resource tuning is a very interesting problem.'

Comparison of a video frame with output from an Inivation DAVIS camera, taken as part of the IoSiRe project. Green/red points correspond to +1/-1 (on/off) spike polarity

16 Imaging and Machine Vision Europe • December 2017/January 2018

Industrial partners of the project have expressed interest in a wide range of applications, from cognitive robotics to smart home and smart environmental sensing at very low power and latency. 'We would be interested to learn more about industrial use cases, as this is definitely something we are keen to explore in the project and beyond,' concluded Andreopoulos.
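Andreopoulos describes converting spike-based sensing into representations suitable for convolutional or recurrent networks as an open problem. One common workaround, sketched here as an illustration rather than the IoSiRe project's actual method, is to rasterise the event stream onto the regular space-time grid such networks expect – a so-called voxel grid:

```python
import numpy as np

def events_to_voxel_grid(events, num_bins, width, height):
    """Discretise an event stream onto a regular space-time grid so a
    convolutional network can consume it. `events` is an array of rows
    (t, x, y, polarity); returns a (num_bins, height, width) tensor of
    summed polarities per temporal bin."""
    grid = np.zeros((num_bins, height, width), dtype=np.float32)
    t = events[:, 0].astype(np.float64)
    # Normalise timestamps onto [0, num_bins) and clip the last event
    # into the final bin
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1e-9) * num_bins
    bins = np.clip(t_norm.astype(int), 0, num_bins - 1)
    for b, e in zip(bins, events):
        _, x, y, p = e
        grid[b, int(y), int(x)] += float(p)
    return grid

# Hypothetical stream of three events on a 4x4 sensor, two time bins
events = np.array([[0, 1, 1, 1], [50, 1, 1, 1], [100, 2, 3, -1]])
grid = events_to_voxel_grid(events, num_bins=2, width=4, height=4)
# the t=0 event falls in bin 0; the t=50 and t=100 events fall in bin 1
```

The tension the article points to is visible even in this toy: binning throws away the microsecond timing that makes the sensor attractive in the first place, which is why spike-native hardware such as TrueNorth and Loihi is an appealing alternative.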




Images: Andreopoulos et al; Inivation

