SENSORS


Artifacts in 3D Time-of-Flight: what are they and how can you avoid them?


Time-of-Flight (ToF) is an active 3D technology that calculates distance by measuring the time a light pulse, emitted by the camera, takes to return after being reflected off an object. Knowing this time and the speed of light, it is possible to calculate the distance traveled by the light pulse, and therefore the distance to the object.
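As a concrete illustration, a minimal sketch of the underlying arithmetic for direct (pulsed) ToF is shown below. The function and variable names are ours for illustration, not from any particular camera API; the one-way distance is half the round-trip path.

import math  # not strictly needed here, kept for the later sketches

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the object from the measured round-trip time."""
    return C * round_trip_time_s / 2.0

# Example: a pulse returning after 20 ns corresponds to ~3 m.
print(tof_distance(20e-9))  # ~2.998 m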


Compared to other 3D techniques, ToF has several advantages: it works reliably in low-light conditions and in textureless environments, and the computational load required to calculate the distance image is low. ToF is also suitable for applications requiring real-time decisions because of its very low latency. However, the technology can suffer from artifacts, which need careful consideration in order to mitigate or even avoid them.


The motion artifact


ToF systems require multiple phases (images) to calculate object distance: typically at least two, three when compensating for ambient light, and sometimes four or more. If the sensor can capture only one phase at a time and there are moving objects in the scene, the objects won't appear at the same location in the different phases, leading to motion artifacts when the phases are combined to compute the distance image: objects appear both misshapen and at the wrong distance. A similar phenomenon occurs if the application requires several captures to manage HDR. Using multi-tap sensors (ideally with as many taps as the algorithm requires phases) combined with non-destructive-readout HDR eliminates this artifact.
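To see why the phases must describe the same scene instant, consider the common four-phase scheme used in indirect (continuous-wave) ToF. The sketch below uses the generic textbook formula, not any specific sensor's pipeline; if the four samples come from sequential captures of a moving object, they describe different scene points and the recovered phase, hence the distance, is wrong.

import math

C = 299_792_458.0  # speed of light, m/s

def cw_tof_distance(a0, a1, a2, a3, f_mod):
    """Distance from four samples of a continuous-wave ToF pixel,
    taken at phase offsets 0°, 90°, 180° and 270°, with modulation
    frequency f_mod in Hz. A multi-tap sensor collects all four
    taps from the same exposure, so a moving object cannot smear
    across the samples (the motion artifact)."""
    phase = math.atan2(a3 - a1, a0 - a2)  # phase shift of the return
    phase %= 2 * math.pi                  # wrap into [0, 2*pi)
    return C * phase / (4 * math.pi * f_mod)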


The aliasing artifact


This artifact occurs when the time of flight is longer than the light's emission period. In that case, the sensor may receive light from an earlier modulation cycle and cannot tell which cycle it belongs to. As a result, an object at distance X plus any integer multiple of the distance corresponding to one emission period appears at distance X, causing errors in the 3D depth map or the associated point cloud. Increasing the period extends this unambiguous distance but reduces measurement precision, so a trade-off is required.
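To put illustrative numbers on this trade-off (example frequencies of our choosing, not vendor specifications), the alias-free range for a continuous-wave system is one full modulation period of round-trip travel:

C = 299_792_458.0  # m/s

def unambiguous_range(f_mod_hz: float) -> float:
    """Maximum distance measurable without aliasing at modulation
    frequency f_mod_hz: half the distance light travels in one
    modulation period (round trip)."""
    return C / (2 * f_mod_hz)

# Higher frequency -> finer phase resolution per metre of distance,
# but a shorter alias-free range.
print(unambiguous_range(20e6))   # ~7.5 m at 20 MHz
print(unambiguous_range(100e6))  # ~1.5 m at 100 MHz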


Using multiple modulation frequencies is a solution for aliasing artifacts, but it worsens the motion artifact. However, sensors with a flexible duty cycle can solve this by decoupling the maximum measurable distance (set by the light's emission period) from the disambiguation.
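A common dual-frequency dealiasing scheme, sketched below under the simplifying assumption of noise-free phase measurements, keeps the distance hypothesis on which both frequencies agree; the combined unambiguous range extends to roughly c / (2 · gcd(f1, f2)). The brute-force search here is for illustration only.

C = 299_792_458.0  # m/s

def dealias(d1, d2, f1, f2, extended_range, tol=0.05):
    """Given aliased distances d1, d2 (m) measured at modulation
    frequencies f1, f2 (Hz), return the candidate distance within
    extended_range on which both measurements agree."""
    r1 = C / (2 * f1)  # alias-free range at f1
    r2 = C / (2 * f2)  # alias-free range at f2
    for n1 in range(int(extended_range / r1) + 1):
        for n2 in range(int(extended_range / r2) + 1):
            c1 = d1 + n1 * r1
            c2 = d2 + n2 * r2
            if abs(c1 - c2) < tol:
                return (c1 + c2) / 2
    return None  # no consistent hypothesis found

# Example: a target at 10.0 m, beyond both individual ranges,
# measured at 20 MHz (~7.5 m range) and 30 MHz (~5.0 m range);
# gcd = 10 MHz, so the combined range is ~15 m.
r1, r2 = C / (2 * 20e6), C / (2 * 30e6)
d1, d2 = 10.0 % r1, 10.0 % r2
print(dealias(d1, d2, 20e6, 30e6, extended_range=15.0))  # ~10.0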


The interference artifact

This artifact occurs when several cameras operate concurrently: light emitted by one system can be received by another, distorting the information carried by the phases and leading to erroneous computation and incorrect distance measurements.


Typically, this artifact is avoided by time-multiplexing, but apart from being complex to implement, it normally requires a wired connection between the different systems. Alternatively, sensors or systems with dedicated multi-system management features help mitigate this artifact.
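As a toy illustration of time-multiplexing (a deliberately simplified sketch: real systems need the synchronized clock or wired trigger whose complexity is mentioned above), each camera is assigned an exclusive emission window within the shared frame period:

def emission_slot(camera_id: int, n_cameras: int,
                  frame_period_ms: float) -> tuple[float, float]:
    """Start and end times (ms, within each frame period) of the
    exclusive window in which this camera may emit and integrate
    light. Assumes all cameras share a synchronized clock."""
    slot = frame_period_ms / n_cameras
    start = camera_id * slot
    return start, start + slot

# Three cameras sharing a 33 ms frame period never emit at once:
for cam in range(3):
    print(cam, emission_slot(cam, 3, 33.0))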




3D Time-of-Flight technology is increasingly used in industrial applications such as robotics, logistics, inspection or dimensioning, since it provides good precision and accuracy combined with high frame rates and low latency. Its flexibility enables use across a wide range of applications, even in complex environments, differentiating it from competing technologies.


As with any technology, some trade-offs are required, and for each pro there can be a con. However, some cons, such as these artifacts, must be avoided, because incorrect distance measurements can lead to collisions involving autonomous mobile robots (AMRs) or injuries around cobots.


In ToF technology, the image sensor plays an even more critical role than in other 3D systems. Using sensors such as Teledyne e2v’s Hydra3D+, designed specifically for ToF applications, ensures that the final system will be robust to artifacts such as the ones mentioned above. In addition, it offers benefits including improved performance in outdoor environments thanks to its patented HiRho technology.


Picture credit: iStock

