ARTIFICIAL INTELLIGENCE FEATURE

SEEING IS BELIEVING


Developing multi-camera vision systems for autonomous vehicles


As self-automated technology becomes increasingly complex, the hardware and software required to accurately manage the data for this technology must meet the intricacies demanded of it. Michaël Uyttersprot, market segment manager of A.I. at Avnet Silica, discusses the challenges and solutions to this dilemma.


There's a gap between the technology that today's self-driving cars use to steer along well-lit, well-marked highways and what is needed to drive along an unlit, unmarked country lane at dusk. Many companies are addressing this issue by developing embedded-vision systems that can capture scene data and then apply machine-learning strategies to plan a vehicle's route.


Creating such systems requires a development toolchain that enables rapid experimentation, systemic optimisation and a fast route to commercialisation.


THE HARDWARE CHALLENGE

One of the main hardware challenges is handling the very large amounts of data created by multiple cameras and sensors, as an advanced driver assistance system may have multiple cameras, radar and lidar sensors, as well as in-vehicle cameras to sense driver alertness. Therefore, autonomous vehicle vision systems must offer reliable and functional safety, real-time execution with low latency, minimal power consumption, flexibility to work with different camera configurations and support for computationally intensive vision-processing algorithms.


Vision systems for cars also need to achieve stringent automotive qualifications, meaning that developers must ensure an easy route from the systems upon which they validate their hardware and software choices to a qualified commercial system.


THE SOFTWARE CHALLENGE

Software challenges include processing very large amounts of data and coordinating the work of heterogeneous multiprocessor system architectures. Machine-learning algorithms are also evolving quickly, so there must be room left for late changes to core code.


A key task in developing vision systems for autonomous vehicles is sensor fusion. This involves correlating inputs from multiple sensors of different types, often with very low latency, to correct deficiencies in individual sensors and enable more accurate position and orientation tracking.
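To make that idea concrete, the sketch below shows one simple way sensor fusion can be framed; it is an illustration rather than the approach described in the article. Two independent position estimates, say one from camera odometry and one from radar, are combined by inverse-variance weighting so that the more reliable sensor dominates the fused result. All names and values are illustrative.

```python
import numpy as np

def fuse_estimates(x_cam, var_cam, x_radar, var_radar):
    """Combine two independent position estimates by inverse-variance weighting."""
    w_cam, w_radar = 1.0 / var_cam, 1.0 / var_radar
    x_fused = (w_cam * x_cam + w_radar * x_radar) / (w_cam + w_radar)
    var_fused = 1.0 / (w_cam + w_radar)
    return x_fused, var_fused

# Example: the camera estimate is noisy at dusk (high variance),
# while the radar estimate is steadier, so the fused result leans towards radar.
position, variance = fuse_estimates(
    x_cam=np.array([12.4, 3.1]), var_cam=0.9,
    x_radar=np.array([12.1, 3.0]), var_radar=0.2,
)
print(position, variance)
```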


Another major software task is to implement efficient machine-learning algorithms to perform object recognition, classification and verification. These machine-learning strategies have become more accessible thanks to the steady rise in general computing power, the development of graphics processing units (GPUs) and field programmable gate arrays (FPGAs), as well as the availability of large training datasets.

A DEVELOPMENT SOLUTION


Creating a vision system for autonomous vehicles means exploring trade-offs among the hardware, software, computer-vision and machine-learning aspects of a design. The development solution is an embedded vision system using multiple cameras, which includes a design methodology, hardware platform, software environment, drivers and access to worked examples.


Michaël Uyttersprot, market segment manager of A.I. at Avnet Silica




The hardware architecture supports four two-megapixel cameras and uses a Xilinx Zynq UltraScale+ MPSoC, which has multiple ARM Cortex-A53 processors and a programmable logic fabric. The cameras connect over coaxial cables to a multi-camera deserialiser on an FPGA mezzanine card (FMC), which is itself plugged into a board that carries the MPSoC.

Zynq UltraScale+ MPSoCs are available with a variety of CPUs, GPUs and video codecs. Their programmable logic fabrics can be configured as hardware accelerators for computationally intensive algorithms.


THE DESIGN METHODOLOGY


The design methodology is based on a set of software resources for developing algorithms, applications and platforms. The development toolchain supports GStreamer, a multimedia framework for linking media-processing subsystems into complex workflows. Its pipeline-based design is well suited to analysing streaming media data using machine-learning techniques.
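As a rough illustration of that pipeline-based style, the Python sketch below builds and runs a trivial GStreamer pipeline through PyGObject. The element chain shown (videotestsrc, videoconvert, fakesink) is a generic stand-in; the camera sources and inference elements used on the actual MPSoC platform are not described in the article.

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

# Build a linear pipeline from a textual description, much as gst-launch does.
pipeline = Gst.parse_launch(
    "videotestsrc num-buffers=300 ! videoconvert ! fakesink"
)
pipeline.set_state(Gst.State.PLAYING)

# Block until the stream ends or an error is reported on the pipeline bus.
bus = pipeline.get_bus()
bus.timed_pop_filtered(
    Gst.CLOCK_TIME_NONE, Gst.MessageType.ERROR | Gst.MessageType.EOS
)
pipeline.set_state(Gst.State.NULL)
```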


[Figure: the general architecture of the hardware/software platform outlined above]

The programmable logic fabric on Xilinx MPSoCs can provide the computing power needed to run machine-learning algorithms. Toolchains from companies such as DeePhi help users apply these resources effectively. The DeePhi deep neural network development kit uses the output from algorithm exploration toolkits such as Google's TensorFlow or Caffe as its input. The DeePhi toolkit includes a graph compression tool, a deep neural network compiler, assembler and a neural network runtime. DeePhi has also developed hardware architectures optimised for video and image recognition, which can be instantiated on the programmable fabric. The toolchain includes simulators and profilers for these architectures.
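The sketch below illustrates only the algorithm-exploration end of that flow: a small classifier is defined and saved with TensorFlow's Keras API, producing the kind of trained-model artefact that a vendor toolkit could then compress and compile for the programmable fabric. The layer sizes, class count and file name are illustrative, and no DeePhi-specific commands are reproduced here.

```python
import tensorflow as tf

# Toy classifier standing in for a network explored in TensorFlow.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),       # camera frame size (illustrative)
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),  # e.g. 10 object classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()

# Save the model; in practice the trained graph exported at this stage
# would be handed to the vendor's graph-compression and compiler tools.
model.save("classifier.h5")
```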


Avnet Silica www.avnet.com



