Feature: Navigation
Figure 6: Stitching LiDAR scans together results in distortions and is computationally expensive
solution that can be adapted to the needs of a wide range of industries and applications. Point One Navigation’s Polaris provides developers with high-precision GPS signal corrections anywhere in the world, without requiring them to manage their own base stations; see Figure 3. Any RTK-compatible GPS receiver can apply these corrections to achieve centimeter-level accuracy in under 5s, delivering high-precision positioning data for applications that demand both speed and accuracy.
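From the receiver’s side, the effect of applied RTK corrections is visible in the fix-quality field of the standard NMEA 0183 GGA sentence, which switches from an ordinary GPS fix to an RTK fix. The sketch below is a minimal stand-alone parser, not part of the Polaris API, and the sample sentence is a textbook NMEA example; it shows how a developer might confirm the receiver has converged to an RTK fix before trusting positions at the centimeter level.

```python
# Minimal NMEA GGA parser to check whether a receiver reports an RTK fix.
# Field layout and quality codes follow NMEA 0183; nothing here is specific
# to Point One's products - it is a generic, illustrative sketch.

GGA_QUALITY = {0: "invalid", 1: "GPS", 2: "DGPS", 4: "RTK fixed", 5: "RTK float"}

def parse_gga(sentence: str):
    """Return (lat_deg, lon_deg, quality_label) from a $..GGA sentence."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA"):
        raise ValueError("not a GGA sentence")

    def dm_to_deg(dm: str, hemi: str, degree_digits: int) -> float:
        # NMEA packs latitude as ddmm.mmmm and longitude as dddmm.mmmm
        deg = float(dm[:degree_digits]) + float(dm[degree_digits:]) / 60.0
        return -deg if hemi in ("S", "W") else deg

    lat = dm_to_deg(fields[2], fields[3], 2)
    lon = dm_to_deg(fields[4], fields[5], 3)
    quality = GGA_QUALITY.get(int(fields[6]), "unknown")
    return lat, lon, quality

lat, lon, q = parse_gga(
    "$GPGGA,123519,4807.038,N,01131.000,E,4,12,0.9,545.4,M,46.9,M,,*47")
# Only treat the position as centimeter-level once q == "RTK fixed"
```

A production driver would also validate the NMEA checksum and handle empty fields, omitted here for brevity.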
Precise location
Point One Navigation’s correction solutions give developers the greatest flexibility for correcting GPS signals, delivering the fastest, highest-accuracy corrections at scale. As an illustration, in just one afternoon the Point One Navigation team built a scalable tech demo for globally georeferencing LiDAR scans without stitching, showing how accessible high-precision GNSS solutions can provide a universal reference frame for global mapping at scale. Despite advancements in geospatial
technology, translating local positioning information into globally georeferenced features remains a challenge for developers. The need to stitch together disparate LiDAR scans into a cohesive map often requires a dedicated tech stack for accurate data interpretation and referencing – a time-consuming process for developers, who would rather build systems quickly and focus on their main objectives. At the same time, developers need to integrate HD maps to stay competitive and resolve some of the world’s most complex problems. This dynamic is forcing many to choose between quality and efficiency as they build products and applications with LiDAR data.

Local referencing of LiDAR data is sufficient for mapping tools where ‘close enough’ will do, such as consumer navigation applications that help individuals find directions to, for example, a storefront. But this approximation is inadequate for producing HD maps that can power exciting new developments in autonomous vehicles, robotics and other systems that require high precision; see Figure 4. For vehicles and other applications to navigate safely and accurately in the real world, they must be built with highly accurate, seamless, globally referenced data. Any developer working on these types of
projects knows that LiDAR point clouds of the same area captured by different sensors will produce disjointed data that requires stitching together; see Figures 5-6. The data captured by these disparate sensors is only usable after first being manually assembled. Only then can developers begin creating a complex suite of hardware, software and algorithms that align the data points to each other (local referencing) and the rest of the world (global referencing), resulting in a unified reference dataset that precisely represents the physical world.

54 July/August 2024 | www.electronicsworld.co.uk

Using traditional methods to stitch
together just two disparately captured LiDAR point clouds is resource-intensive, yet that is nowhere near the effort needed to produce HD maps at the scales required by global solutions. If stitching together only two scans takes ten minutes, the resources needed to create a map of an entire city are exponentially greater. As high-precision maps and LiDAR data increasingly fuel complex solutions at global scales, developers need a tech stack that allows them to build accurate maps quickly and efficiently.
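The distinction between local and global referencing comes down to a rigid-body transform: if a sensor’s global pose (rotation R and translation t) is known precisely from high-accuracy GNSS, every locally referenced point maps straight into the global frame as p_global = R·p_local + t, with no pairwise stitching. A minimal sketch with illustrative numbers, not Point One’s actual pipeline:

```python
import numpy as np

def georeference(points_local: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Map Nx3 sensor-frame points into the global frame: p_global = R @ p + t."""
    return points_local @ R.T + t

# Hypothetical sensor pose: heading 90 degrees, positioned 100 m east,
# 200 m north, 5 m up in some local-tangent-plane global frame.
yaw = np.deg2rad(90.0)
R = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
              [np.sin(yaw),  np.cos(yaw), 0.0],
              [0.0,          0.0,         1.0]])
t = np.array([100.0, 200.0, 5.0])

# Two points as the LiDAR sees them, in its own local frame
points = np.array([[1.0, 0.0, 0.0],
                   [0.0, 2.0, 0.0]])
global_pts = georeference(points, R, t)
```

Scans from any number of sensors transformed this way land in the same frame automatically, which is what removes the need for scan-to-scan alignment.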
A new tech stack
Recent advancements in GNSS solutions offer all the benefits of globally referenced data without the need to stitch disparate sensor data. With highly accurate location data on a Universal Frame, scans can be placed directly in 3D space without complex stitching algorithms. To demonstrate how developers can use this high-precision data to globally georeference LiDAR point clouds and produce a seamless reference frame, the computer vision team at Point One Navigation developed a scalable tech stack with the following components:
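One concrete choice of universal frame – an assumption for illustration, not necessarily the frame used by Point One’s stack – is Earth-Centred Earth-Fixed (ECEF) coordinates on the WGS84 ellipsoid: every GNSS fix converts to a single global XYZ, so every scan’s origin lives in the same 3D space. The standard closed-form conversion is:

```python
import math

# WGS84 ellipsoid constants
WGS84_A = 6378137.0                 # semi-major axis, metres
WGS84_F = 1.0 / 298.257223563       # flattening
WGS84_E2 = WGS84_F * (2.0 - WGS84_F)  # first eccentricity squared

def geodetic_to_ecef(lat_deg: float, lon_deg: float, alt_m: float):
    """Convert a geodetic GNSS fix (lat, lon, height) to ECEF XYZ in metres."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    # Prime-vertical radius of curvature at this latitude
    n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat) ** 2)
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - WGS84_E2) + alt_m) * math.sin(lat)
    return x, y, z

# Sanity check: on the equator at the prime meridian, X is the
# semi-major axis and Y and Z are zero
x, y, z = geodetic_to_ecef(0.0, 0.0, 0.0)
```

Combining this with the per-scan rigid transform gives each point cloud a globally unique position the moment it is captured.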