Viewpoints




Machine vision under the microscope


Five industry experts give their opinion on five key areas in machine vision: embedded processing, 3D imaging, automotive production, traffic, and robot vision


Terry Arden, CEO of LMI Technologies, assesses which technique is winning the race for 3D vision in factory automation


The use of 3D technology in manufacturing is growing, as inspecting features related to shape (angles between surfaces, the depth of a groove, or the diameter of a countersunk hole, for example) is a required step toward precise assembly of a final product. It took 20 years for 2D machine vision to achieve wide adoption in factory automation processes, in which part features related to contrast are now commonly inspected (fiducial markings on a PCB, distances between edges on a part, the correct label on a barcode, and so on). The maturity of 2D has set up the next decade for 3D adoption – where the combination of shape (3D) with texture (2D) tells everything that needs to be known to drive fully automated manufacturing producing high-quality products. The main activity today is in two areas: firstly, 3D smart sensor solutions for factory automation handling more of the overall inspection need; and secondly, 3D scanning solutions for vertical markets such as wood, road, metals, dental, or automotive, where significant value-add is provided by an OEM, system integrator, or volume reseller.

As part volumes increase in production and parts shrink in size (requiring resolutions of a few microns), many industries – especially electronics and medical – are demanding high-speed, micron-resolution sensing solutions that measure both 3D and 2D features. Which 3D technology will win over the need for speed? Which will deliver high resolution? Is there a single approach that can deliver both 2D and 3D – offering both speed and resolution?


Triangulation vs fringe projection

There are two predominant 3D measurement approaches suitable for factory automation today. The first is laser triangulation, where a camera observes a reflected laser line projected onto the surface of a part to produce a contour. As the part moves under this system, a 3D point cloud is generated representing part shape. The second is structured light based on fringe projection, where a camera observes a sequence of light patterns projected onto the surface of a part using LEDs instead of lasers. The part is stationary while the grey code and phase pattern projections are collected, from which a 3D point cloud is then computed.
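To make the triangulation step concrete, here is a minimal sketch in Python with NumPy. The per-column peak search and the single scale factor mm_per_pixel_z are simplifying assumptions for illustration, not how any particular sensor is implemented: each image of the reflected line yields one height profile, and stacking profiles as the part moves builds the point cloud.

```python
import numpy as np

def profile_from_laser_image(image, mm_per_pixel_z):
    """Extract one height profile from a single laser-line image.

    image          : 2D array (rows x cols), bright laser line on a darker part
    mm_per_pixel_z : assumed linear scale, mm of height per pixel of line
                     displacement (a real sensor uses a full camera/laser
                     calibration rather than one constant)
    """
    # For every column, take the row with the strongest response as the
    # laser-line position (sub-pixel peak fitting would be used in practice).
    peak_rows = np.argmax(image, axis=0).astype(float)

    # Convert pixel displacement into height relative to a reference row.
    reference_row = image.shape[0] / 2.0
    return (reference_row - peak_rows) * mm_per_pixel_z

def point_cloud_from_scan(images, mm_per_pixel_z, step_mm):
    """Stack one profile per motion step into an (N*cols, 3) point cloud."""
    points = []
    for i, img in enumerate(images):
        z = profile_from_laser_image(img, mm_per_pixel_z)
        x = np.arange(z.size)             # lateral position along the line
        y = np.full(z.size, i * step_mm)  # travel direction, from the encoder
        points.append(np.column_stack((x, y, z)))
    return np.vstack(points)
```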


Calibrating the camera and laser line or LED projector delivers a measurement device that is widely used in factory automation to ensure parts are manufactured accurately – often by verifying surface quality (for example, no scratches) and assembly feature location accuracy (such as the position of a threaded hole). Part surfaces that mate together are further inspected for flush and gap.
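As a rough illustration of a flush-and-gap check, the sketch below estimates both quantities from one calibrated cross-section profile taken across the joint. The drop threshold and the single-profile approach are assumptions made for the example, not a description of any particular product.

```python
import numpy as np

def flush_and_gap(x_mm, z_mm, drop_mm=0.5):
    """Toy flush/gap estimate from one calibrated cross-section profile.

    x_mm, z_mm : lateral position and height along the profile (mm)
    drop_mm    : assumed height drop that marks the gap between the parts
    """
    surface = np.median(z_mm)
    in_gap = z_mm < surface - drop_mm            # samples that fall into the gap
    idx = np.flatnonzero(in_gap)
    if idx.size == 0 or idx[0] == 0 or idx[-1] == z_mm.size - 1:
        return None                              # no interior gap found

    gap = float(x_mm[idx[-1]] - x_mm[idx[0]])    # gap width across the joint

    # Flush: height offset between the mating surfaces on either side.
    flush = float(np.median(z_mm[idx[-1] + 1:]) - np.median(z_mm[:idx[0]]))
    return gap, flush
```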


Laser line triangulation provides a way to scan very fast-moving objects, often in environments lit with ambient light. However, this technology is limited by laser line width, which determines resolution along the movement axis, and produces laser speckle that creates noisy data at micron resolutions. High-accuracy encoders and stable motion systems are also needed, and these drive higher cost. In addition, laser safety issues can delay projects.

Fringe projection offers very high resolution (sub-micron) and captures the entire part at once. Projector light intensity and pattern sequencing determine cycle times and how long a part must be at rest. Parts cannot move during acquisition with this approach. Today's cycle times can take up to several seconds per part.
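For context, the sketch below shows roughly what decoding the phase patterns mentioned earlier computes per pixel: a standard N-step phase-shift formula, here assuming the projected pattern is I_i = A + B*cos(phase - 2*pi*i/N). The grey-code frames then supply the coarse fringe index needed to unwrap this phase; details vary by implementation.

```python
import numpy as np

def wrapped_phase(images):
    """Per-pixel wrapped phase from N equally phase-shifted fringe images.

    images : sequence of N camera frames of the same sinusoidal pattern,
             shifted by 2*pi/N between frames (assumed model:
             I_i = A + B*cos(phase - 2*pi*i/N))
    """
    imgs = np.asarray(images, dtype=float)       # shape (N, rows, cols)
    n = imgs.shape[0]
    shifts = 2.0 * np.pi * np.arange(n) / n

    # Weighted sums over the N frames give the sine and cosine components.
    num = np.tensordot(np.sin(shifts), imgs, axes=1)
    den = np.tensordot(np.cos(shifts), imgs, axes=1)
    return np.arctan2(num, den)                  # wrapped to (-pi, pi]

# The grey-code frames give the integer fringe index k per pixel, so the
# absolute projector coordinate is roughly (k + phase/(2*pi)) * fringe period,
# which a calibrated camera/projector pair triangulates into a 3D point.
```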


Currently, laser-based triangulation sensors are winning the race, primarily because high-speed camera chips running at thousands of frames per second are available with sufficient sensitivity to work with eye-safe laser power (<1mW). To handle 2D, dedicated cameras, lighting, and software are added to build a hybrid inspection solution. The present situation leads to complexity and high cost, which ultimately slows adoption. In factory automation, laser line profiling sensors are still at an early adoption stage – used only where there is a pressing need.
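A back-of-the-envelope calculation shows why those same high-speed chips matter for structured light as well. The pattern count and frame rates below are illustrative assumptions, not figures from the article, and they ignore cases where projector light, rather than the camera, limits exposure.

```python
# Illustrative numbers only: pattern count and frame rates are assumptions.
patterns = 24                        # e.g. grey-code frames plus phase shifts
for fps in (60, 1000, 5000):         # conventional vs high-speed camera chips
    print(f"{fps:>5} fps -> {1000 * patterns / fps:.0f} ms of acquisition per part")
# At 60 fps the part must rest for roughly 400 ms (longer if exposure is
# light-limited); multi-kHz chips cut acquisition to a few milliseconds.
```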


But has the 3D race really started? Imagine that these same fast camera chips are used to accelerate structured light scanners to deliver full part shape and texture at intelligent resolutions – higher for small features, lower for larger features. New factory designs will begin adopting robot part handling that moves parts in and out of inspection areas and carries out precise assembly, eliminating non-conforming parts along the way. By combining 2D image acquisition with 3D pattern projection, a universal solution is delivered at low overall system complexity. In this scenario, process engineers use such a smart scanner to visualise parts in high definition, selecting regions of interest and adding measurements and tolerances, then going live for fully automated operation. This direction leads us to an 'Industry 4.0' scenario relying on robots and smart sensors to achieve automated factories driven by 3D part models – the next revolution for industry.
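As a purely hypothetical sketch of that "select regions of interest, add measurements and tolerances, then go live" workflow, an offline recipe and a pass/fail check might look like the following; the names and structure are invented for illustration and are not taken from any vendor's software.

```python
# Hypothetical inspection recipe a process engineer might build offline.
recipe = {
    "part": "bracket_A",
    "regions_of_interest": {
        "threaded_hole": {"center_mm": (12.5, 40.0), "radius_mm": 3.0},
    },
    "measurements": [
        {"name": "hole_position", "roi": "threaded_hole",
         "nominal_mm": (12.5, 40.0), "tolerance_mm": 0.05},
    ],
}

def judge(measured_mm, nominal_mm, tolerance_mm):
    """Pass/fail decision for one measurement once the recipe goes live."""
    worst = max(abs(m - n) for m, n in zip(measured_mm, nominal_mm))
    return worst <= tolerance_mm
```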

