Feature: Machine vision

Enhancing vision sensor capabilities with a 3D image stitching algorithm

By Rajesh Mahapatra, Senior Manager, Anil Sripadarao, Principal Engineer, and Swastik Mahapatra, Senior Engineer, all with Analog Devices (ADI)

18 | July/August 2025 | www.electronicsworld.co.uk

The rising popularity of time-of-flight (TOF) cameras in industrial applications, particularly in robotics, is attributed to their depth-computing and infrared (IR) imaging capabilities. Despite these advantages, the inherent complexity of the optical system often limits the field of view, restricting standalone functionality.

A 3D image stitching algorithm designed to run on a host processor eliminates the need for cloud computation. This algorithm seamlessly combines IR and depth data from multiple TOF cameras in real time, producing a continuous, high-quality 3D image with a field of view wider than any standalone unit can offer. The stitched 3D data enables the application of state-of-the-art deep-learning networks, particularly valuable in mobile robotics, to revolutionise visualisation of, and interaction with, the 3D environment.

Stitching captured data

TOF cameras stand out as a high-performing class of imaging systems: they determine the distance between the camera and each point in a scene by measuring the round-trip time of an artificial light signal emitted by a laser or an LED. The precise depth information they provide makes them valuable tools wherever accurate distance measurement and 3D visualisation are crucial, such as robotics and industrial applications, including collision detection and human detection over a 270° field of view (FOV) for safety.

The ADTF3175 TOF sensor from Analog Devices achieves a calibrated 75° FOV. However, challenges arise when an application's FOV exceeds this region, requiring multiple sensors. Integrating the data from individual sensors to provide comprehensive analytics for the entire view can be difficult. One potential solution is to have each sensor run an algorithm on its partial FOV and transmit the output to a host for collation. However, this approach can suffer from overlapping or dead zones and communication latencies, making it a complex problem to address effectively.

An alternative approach is to stitch the captured data from all sensors into a single image and then apply detection algorithms to the stitched image. This process can be offloaded to a separate host processor, relieving the sensor units of computational load and leaving room for advanced analytics and other processing options. However, traditional image stitching algorithms are inherently complex and can consume a significant portion of the host processor's computational power. Furthermore, sending data to the cloud for stitching is not possible in many applications for privacy reasons.

ADI's algorithmic solution stitches the depth and IR images from different sensors using point cloud projections of the depth data. The captured data is transformed using the cameras' extrinsic positions and projected back into 2D space, producing a single continuous image. This approach requires minimal computation, which helps achieve real-time operating speeds at the edge and ensures that the compute capacity of the host processor remains available for other tasks.
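The transform-and-reproject idea can be sketched in a few lines of NumPy. This is a minimal illustration assuming a standard pinhole camera model; the function and parameter names are invented for the example and are not taken from ADI's software.

```python
# Sketch: back-project a depth image into a 3D point cloud using the
# camera intrinsics (pinhole model), then move the points into a common
# world frame using the camera's extrinsic pose (rotation R, translation t).
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project an HxW depth image (metres) into an Nx3 point cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx          # pinhole model: X = (u - cx) * Z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]      # drop invalid (zero-depth) pixels

def transform_points(points, R, t):
    """Apply a rigid extrinsic transform to an Nx3 point cloud."""
    return points @ R.T + t
```

Running each sensor's depth frame through these two steps, with per-sensor extrinsics, places all points in one shared coordinate frame ready for merging.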
Description of a solution

ADI's 3D TOF solution operates in four stages (Figure 1):
1. Pre-processing IR and depth data: time synchronisation and pre-processing of the IR and depth data.
2. Projecting depth data into a 3D point cloud: uses the camera intrinsic parameters to project the depth data into a 3D point cloud.
3. Transforming and merging the points: transforms points using the cameras' extrinsic positions and merges overlapping regions.
4. Projecting the point cloud into a 2D image: employs cylindrical projection to convert the point cloud back into a 2D image.
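Stage 4 above can be sketched as follows: a minimal cylindrical projection that rasterises the merged point cloud back into a single 2D image. The mapping chosen here, azimuth to column and elevation ratio to row, is one common formulation used as an assumption for illustration, not necessarily the exact projection ADI uses.

```python
# Sketch: cylindrical projection of a merged point cloud onto a 2D image.
# Each 3D point (x, y, z) carries a value (e.g. IR intensity) that is
# written into the output pixel its cylindrical coordinates map to.
import numpy as np

def cylindrical_project(points, values, width, height, v_range=(-1.0, 1.0)):
    """Rasterise Nx3 points with per-point values onto a cylindrical image."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    theta = np.arctan2(x, z)                  # azimuth around axis -> column
    r = np.hypot(x, z)                        # horizontal range from the axis
    v = y / np.maximum(r, 1e-6)               # elevation ratio -> row
    cols = ((theta + np.pi) / (2 * np.pi) * (width - 1)).astype(int)
    rows = ((v - v_range[0]) / (v_range[1] - v_range[0]) * (height - 1))
    rows = np.clip(rows.astype(int), 0, height - 1)
    img = np.zeros((height, width), dtype=values.dtype)
    img[rows, cols] = values                  # last point wins per pixel
    return img
```

Because the points from all sensors already share one frame after stage 3, a single call over the merged cloud yields the continuous wide-FOV image; overlap handling here is simply last-write-wins, whereas a production merge would blend or depth-sort competing points.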