AUTOMOTIVE


a brightly lit sky’. At the time, 130 million miles had been driven by Tesla vehicles with Autopilot activated; in October, Elon Musk tweeted that the figure had risen to 222 million miles. No further fatalities have occurred since, and Tesla is upgrading its systems to prevent these sorts of crashes from happening again. Today, cars rolling off the production line will likely have as standard autonomous emergency braking, lane departure warning systems and active cruise control, as well as blind spot monitoring and automated parking. This is not confined to the traditional luxury brands: companies such as Kia, which traditionally competes in the lower-priced market, offer these systems too. These systems represent level one (driver assistance warnings) and level two (partial automation of steering and/or speed) on the SAE automation scale, which runs from zero (no automation) to five (fully automated).

A good mix of sensors

The experts interviewed for this feature all highlighted three technologies that will be used in autonomous vehicles: cameras, lidar and radar. Fusion Processing’s Hutchinson also noted low-cost ultrasound technology, which could be used in parking applications, for example. Narayan Purohit of On Semiconductor’s automotive image sensor division explained the three technologies: ‘Cameras are the master of classification; they are by far the cheapest and most available compared to lidar or radar. They can also see colour, which allows them to make the best of scenic information.

‘Lidar... is the master of 3D mapping. It can scan more than 100 metres in all directions to generate reasonably precise 3D maps of the car’s surroundings. Radar is the master of motion measurement.’

There are significant limitations in each of these technologies. A camera’s accuracy is severely affected by light conditions, by weather and by fast changes in light. Cameras also produce lots of data, which places demands on the car’s processing systems. Lidar – albeit significantly less so than cameras – is also affected by weather. The flash lidar variant is too power-intensive to scale, while scanning lidar, which uses mechanically steered beams, is slower to scan and prone to failure. Radar is both less granular and slightly weaker in angular response than lidar, but it works in all weather conditions.

Bruce Welty of Locus Robotics, a developer of autonomous logistics factory robots, commented: ‘If you add one more sensor to the mix, even if that sensor is quite inaccurate, it can substantially improve accuracy.’ He added: ‘If I want to maximise server time and [each of four servers] has an 80 per cent uptime, then [as] all you need is one server to be up, you can get incredibly high run times. Adding another server, even with just a 50 per cent uptime, will [halve your downtime].’
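
Welty’s figures follow from treating redundant units in parallel: the system is down only when every unit is down, so individual downtimes multiply. A minimal sketch of that arithmetic in Python, using the illustrative uptime figures from his example:

# Availability of redundant units in parallel: the system is down only
# when every unit is down at once, so individual downtimes multiply.
def parallel_availability(uptimes):
    downtime = 1.0
    for u in uptimes:
        downtime *= 1.0 - u
    return 1.0 - downtime

# Four servers at 80 per cent uptime: downtime = 0.2**4 = 0.0016,
# i.e. roughly 99.84 per cent availability.
print(parallel_availability([0.8] * 4))          # ~0.9984

# A fifth unit at just 50 per cent uptime halves the remaining
# downtime, matching Welty's point.
print(parallel_availability([0.8] * 4 + [0.5]))  # ~0.9992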


Sensor fusion

Dr Sim Bamford, CTO at neuromorphic camera technology developer Inivation, commented: ‘The incident with Tesla made the point that no single system should be relied on. There should be [redundancy]. Vision can fail, radar can fail. If you have two or more systems – radar, lidar, visual – working together you’re going to get better results.’

Fusion Processing’s Hutchinson said: ‘For a simple operation, in a very benign environment, you may not need many sensors, possibly only cameras, but that would limit what you can do safely. A more common configuration would be several cameras – including stereo cameras – [and] several [long- and short-range] radar units. Sometimes it’s useful to include certain sensors, for example ultrasound, for close-proximity sensing.’ Hutchinson confirmed Fusion Processing’s Cavstar autonomous vehicle also supports lidar, but the firm wasn’t using lidar in any live projects.
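
The same logic underlies probabilistic sensor fusion. As a hedged illustration only (not Fusion Processing’s or any vendor’s actual pipeline), independent detection confidences from camera, radar and lidar can be combined by adding their log-odds, so several mediocre sensors that agree yield higher confidence than any one alone:

import math

# Illustrative naive-Bayes fusion of independent sensor confidences,
# assuming a 50 per cent prior: posterior log-odds simply add.
def fuse(confidences):
    log_odds = sum(math.log(p / (1.0 - p)) for p in confidences)
    return 1.0 / (1.0 + math.exp(-log_odds))

# Camera, radar and lidar each weakly detect the same obstacle:
print(fuse([0.7, 0.6, 0.65]))  # ~0.87, higher than any single sensor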





