Read more on autonomous vehicles online at https://goo.gl/qhhdfB or scan the QR code
AUTONOMY RACE

Shorter-range LiDAR sensors are positioned low on all four sides of the vehicle – one in each front quarter panel and one each on the front and rear bumpers. These can detect low-level and smaller objects near the car, such as children and debris in the roadway. Toyota has gone one step further by incorporating the LiDAR systems into the coachwork design to eliminate the “spinning bucket” appearance that has always been associated with the technology.

READING THE DRIVER’S MIND
Nissan has gone outside the boundaries of isolated autonomy by bringing driver perception into consideration, with technology that can be used to help develop autonomous control algorithms or deployed in standard ADAS-equipped vehicles. Brain-to-Vehicle (B2V) technology potentially gives the car the ability to adapt to more human-like control. Based on the notion that machine perception can’t match that of the human brain, Nissan is now reading drivers’ minds to see how they react to danger, visual input and other stimuli. According to Nissan Executive Vice President Daniele Schillaci, “When most people think about autonomous driving, they have a very impersonal vision of the future, where humans relinquish control to the machines. Yet B2V technology does the opposite, by using signals from their own brain to make the drive even more exciting and enjoyable.”

❱ ❱ Nissan places sensors on the driver to read brain output for predictive control

AUTONOMOUS WORLD TOUR
❱ ❱ Daimler concludes its World Autonomous Drive with the challenges of US roads

Daimler used CES to punctuate the end of its “Autonomous World Tour”, conducted on five continents during the last 12 months in a modified Mercedes S-Class. Daimler described the USA as a fitting end to the tour, offering the kind of road conditions that pose a significant challenge to the sensors and control systems on driverless cars. With the need to recognise school buses and their flashing light codes, road signs and lane markings that vary from state to state, and differing styles of high-occupancy lane identifiers, driving automation features need the ability to adapt to different conditions within the boundaries of a single country.

ENABLING TECHNOLOGY
As well as the major automotive manufacturers, CES was attended by companies without which the car makers would struggle to meet the demands of autonomy. Image sensing and miniaturisation were key themes for technology giants such as Intel and ON Semiconductor, both of whom were at the show to demonstrate their ability to squeeze ever-increasing perceptive power into smaller packages. The ON Semiconductor range of CMOS image sensors has been modified to deliver greater pixel density to identify finer detail, combined with low-level light performance and high dynamic range that lets the sensor cope with rapidly changing light conditions, such as the strobe effect of flashes of sunlight through an avenue of trees. To meet restrictive space requirements, ON Semiconductor has developed new wafer-stacking technology to reduce the package size.

The Mobileye wing of Intel has developed a System on Chip (SoC) package for use with CMOS sensors to handle the vast amounts of data travelling from the sensors to the vehicle control system, a task that needs to be performed at extremely low latency and without packet loss to enable safe autonomous control. This is a massive challenge for the industry, and one that Intel is actively engaged in with partner motor manufacturers including BMW, Nissan and Volkswagen.
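The low-latency, loss-free transport requirement can be made concrete with a small sketch. This is purely illustrative and not Mobileye’s or Intel’s actual interface: frames carry sequence numbers and capture timestamps, so the receiving side can detect any dropped frame and track worst-case latency.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    seq: int          # monotonically increasing sequence number
    t_capture: float  # capture timestamp, seconds
    payload: bytes    # raw pixel data (stubbed out here)

class FrameReceiver:
    """Detects dropped frames and tracks worst-case transport latency.

    Illustrative only: real sensor links do this in hardware/firmware.
    """
    def __init__(self):
        self.expected_seq = 0
        self.dropped = 0
        self.max_latency = 0.0

    def receive(self, frame: Frame, now: float) -> None:
        if frame.seq != self.expected_seq:
            # A gap in sequence numbers means frames were lost in transit.
            self.dropped += frame.seq - self.expected_seq
        self.expected_seq = frame.seq + 1
        self.max_latency = max(self.max_latency, now - frame.t_capture)

rx = FrameReceiver()
for seq in (0, 1, 3, 4):  # frame 2 is deliberately lost
    t_capture = seq * 0.033                       # ~30 fps camera
    rx.receive(Frame(seq, t_capture, b""), now=t_capture + 0.002)

print(rx.dropped)      # 1 dropped frame detected
```

In a safety-critical pipeline the `dropped` counter would trigger a fault reaction rather than a print statement; the point is that loss and latency must be observable per frame, not averaged away.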
AI ENABLED LIDAR
One company, AEye, is taking LiDAR sensing technology a stage further by introducing artificial intelligence to the equation. iDAR (Intelligent Detection and Ranging) is a robotic perception system that allows sensors to mimic the visual cortex, bringing real-time intelligence to data collection. As a result, the system not only captures everything in a scene – it brings higher resolution to key objects and exceeds industry-required speeds and distances. “By solving the limitations of first-generation LiDAR-only products, AEye is enabling the safe, timely rollout of failsafe commercial autonomous vehicles,” says Luis Dussan, founder and CEO of AEye.
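AEye has not published iDAR’s internals here, but the underlying foveation idea – spend the scan budget where objects actually are – can be sketched as a two-pass scan. In this hypothetical example, a coarse pass over a 1-D field of view flags cells containing returns, and a fine pass re-samples only those cells at higher density.

```python
# Illustrative sketch of foveated scanning (not AEye's actual iDAR
# algorithm): coarse pass finds occupied cells, fine pass revisits
# only those cells at 4x sampling density.

def coarse_scan(field, cell_size):
    """Return start indices of cells containing at least one return."""
    hits = []
    for i in range(0, len(field), cell_size):
        if any(field[i:i + cell_size]):
            hits.append(i)
    return hits

def fine_scan(field, hit_cells, cell_size, oversample=4):
    """Re-sample occupied cells with extra shots per position."""
    detail = {}
    for i in hit_cells:
        # A real sensor would steer additional laser shots into the cell.
        detail[i] = [field[j]
                     for j in range(i, min(i + cell_size, len(field)))
                     for _ in range(oversample)]
    return detail

# 0 = empty space, 1 = return (e.g. a pedestrian at positions 8-9)
field = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 0]
cells = coarse_scan(field, cell_size=4)    # only the occupied cell fires
detail = fine_scan(field, cells, cell_size=4)
print(cells)            # [8]
print(len(detail[8]))   # 16 samples instead of 4
```

The payoff is the same as in the visual cortex analogy the article draws: most of the scene is covered cheaply, while detected objects get disproportionately more measurement time.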
DAQ, Sensors & Instrumentation
Vol 1 No. 1 /// 3