VISION AWARD


directions. An FPGA in the camera calculates an albedo image (a conventional 2D image of the surface) and a gradient image representing the surface normal vector at each acquired surface point. The output of the FPGA can be used for inline 3D surface inspection. Furthermore, the processing can serve as a pre-processing step to supply classical or AI-based object classifiers with rich photometric stereo data of the objects’ surface, making the classifiers more discriminative. Xposure:Photometry combines 2D


high-speed capture with on-camera 3D surface capture which, to AIT’s knowledge, is currently not on the market. When 1D surface gradients are sufficient, the system can reach 300kHz acquisition speed using two light directions. When full 2D surface gradients are required, 200kHz is reached using three light directions. Finally, when targeting optimal 3D information of the surface structure, the system can run at 150kHz using four light directions. In its high-precision configuration,


Xposure:Photometry delivers lines 2,048 pixels wide at a line rate of 150kHz, rising to 300kHz in a lower-precision configuration. Potential applications include: wire inspection, where material defects on the surface of wires can be identified as they are drawn at speeds of 100m/s; print inspection, such as inspecting passports, banknotes or other high-quality printed material; traffic infrastructure monitoring; and inspecting battery electrodes, which are made of a very dark material – Xposure:Photometry is sensitive to differences in greyscale value and is therefore able to detect defects such as scratches and pinholes on dark surfaces.
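The albedo and gradient images that the FPGA computes correspond to classic Lambertian photometric stereo: per-pixel intensities under k known light directions are solved for a vector whose length is the albedo and whose direction is the surface normal. AIT has not published its implementation, so the following NumPy sketch (all function and variable names are ours) only illustrates the underlying maths for the general k ≥ 3 case:

```python
import numpy as np

def photometric_stereo(images, light_dirs):
    """Recover per-pixel albedo and surface normals from images taken
    under known light directions (classic Lambertian model).

    images:     array of shape (k, h, w), one image per light direction
    light_dirs: array of shape (k, 3), unit light direction vectors
    """
    k, h, w = images.shape
    I = images.reshape(k, -1)                  # (k, h*w) intensity stack
    # Lambertian model: I = L @ g, where g = albedo * normal
    g, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)   # (3, h*w)
    albedo = np.linalg.norm(g, axis=0)         # per-pixel albedo
    normals = np.divide(g, albedo, out=np.zeros_like(g), where=albedo > 0)
    return albedo.reshape(h, w), normals.reshape(3, h, w)

# Synthetic check: flat surface tilted toward +x, true albedo 0.8
n_true = np.array([0.6, 0.0, 0.8])             # unit surface normal
L = np.array([[0, 0, 1], [1, 0, 1], [0, 1, 1]], dtype=float)
L /= np.linalg.norm(L, axis=1, keepdims=True)  # normalise light directions
imgs = (0.8 * (L @ n_true)).reshape(3, 1, 1) * np.ones((3, 4, 4))
alb, nrm = photometric_stereo(imgs, L)
```

With two lights only the 1D gradient component is constrained, and with four the least-squares solve becomes overdetermined and more noise-robust, which matches the speed/quality trade-off the article describes.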


Prophesee’s evaluation kit


Prophesee Metavision technology: a comprehensive event- based sensing and software solution


By Christoph Posch, Prophesee


The Metavision platform provides developers of machine vision applications with a complete solution for implementing event-based vision in their systems. It is particularly well suited to applications in high-speed quality control, inspection and analytics, but has proven to have unique capabilities in other areas too, including helping to restore or enhance vision in people with conditions that impair their sight. Event-based vision is a paradigm shift


in imaging that addresses the limitations of traditional frame-based cameras. It is based on how the human eye records and interprets visual inputs. The sensors facilitate machine vision by recording changes in the scene, rather than recording the entire scene at regular intervals. Specific advantages over frame-based


approaches include better dynamic range (>120dB), reduced data generation (10x-1,000x less than conventional approaches), leading to lower transfer and processing requirements, and higher temporal resolution (microsecond precision, equivalent to more than 10,000 images per second). Other advantages are low-light imaging, down to 0.08 lux, and power efficiency, with just 3nW per event and 26mW at sensor level. At the core of the innovation is the


Metavision sensor, a third-generation 640 x 480 VGA event-based sensor. Inspired by the


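The change-detection principle behind event-based sensing can be emulated in software: a pixel fires an event when its log-intensity moves beyond a contrast threshold relative to its level at that pixel's previous event, producing a sparse stream of (x, y, t, polarity) tuples instead of full frames. The sketch below is a simplified model for intuition only, not Prophesee's implementation; all names are illustrative.

```python
import numpy as np

def frames_to_events(frames, timestamps, threshold=0.2):
    """Emulate an event sensor: emit (x, y, t, polarity) whenever a
    pixel's log-intensity changes by more than `threshold` relative to
    its level at that pixel's previous event (simplified model)."""
    eps = 1e-6                                 # avoid log(0)
    ref = np.log(frames[0] + eps)              # per-pixel reference level
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_i = np.log(frame + eps)
        diff = log_i - ref
        ys, xs = np.nonzero(np.abs(diff) >= threshold)
        for y, x in zip(ys, xs):
            events.append((int(x), int(y), t, 1 if diff[y, x] > 0 else -1))
            ref[y, x] = log_i[y, x]            # reset reference at event
    return events

# A static scene generates no events; one brightening pixel generates one
frames = np.full((3, 2, 2), 0.5)
frames[1:, 0, 1] = 1.0                         # pixel (x=1, y=0) brightens at t=1
events = frames_to_events(frames, [0, 1, 2])
```

The logarithmic reference is what gives the data-reduction and dynamic-range advantages described above: static regions produce no output at all, and the threshold is on relative rather than absolute brightness change.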


Easy pick-and-place for complex objects combining light field technology with artificial intelligence


By Dr Christoph Garbe, HD Vision Systems


LumiScan Object Handling version two and its 3D sensor, LumiScanX, provide an industrial approach to light field 3D imaging. The system is able to image glossy or shiny parts, while also reducing occlusions on objects. Thus, the light field sensor is well suited to inspecting complex objects, such as forged parts, semi-transparent pieces or blunt plastic. At the same time, the compact


multi-camera array consisting of 13 lenses provides more precise information than conventional 3D imaging methods. On the software side, pre-configured


and pre-trained neural networks are used for the vision task. The user can fine-tune the algorithm on labelled images of their parts with the help of intuitive software. If a workpiece has to be picked up in one way and put down upright, for example, staff can simply mark these areas via a drag-and-drop tool in the software. After training, the software learns to locate the objects in the image and determines possible grip points. This information is then used to calculate and send waypoints to a connected PLC or robot controller. In addition, through LumiScan Object


Handling, the sensor can be installed and calibrated in less than two minutes.
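The article does not describe how LumiScan converts a grip point detected in the image into a waypoint for the robot controller. A common pattern, sketched below under the assumption of a calibrated pinhole camera and a known camera-to-robot transform (all parameter names here are hypothetical, not part of the LumiScan API), is to back-project the pixel with its measured depth and map the result into the robot's base frame:

```python
import numpy as np

def pixel_to_world(u, v, depth, fx, fy, cx, cy, T_cam_to_robot):
    """Back-project a detected grip point (u, v) with measured depth
    into a 3D point in the robot's base frame (pinhole camera model).
    fx, fy, cx, cy are camera intrinsics; T_cam_to_robot is a 4x4
    homogeneous transform from camera to robot coordinates."""
    # Camera-frame coordinates from the pinhole projection model
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    p_cam = np.array([x, y, depth, 1.0])       # homogeneous camera point
    return (T_cam_to_robot @ p_cam)[:3]

# Example: camera frame offset 0.5m along z from the robot base
T = np.eye(4)
T[2, 3] = 0.5                                  # hypothetical camera pose
p = pixel_to_world(320, 240, 0.4, fx=600, fy=600, cx=320, cy=240,
                   T_cam_to_robot=T)
```

In practice the transform T_cam_to_robot comes from a hand-eye calibration step, which is presumably part of the two-minute install-and-calibrate procedure mentioned above.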


IMAGING AND MACHINE VISION EUROPE AUGUST/SEPTEMBER 2021

Complex objects can now be fully


automated with robot pick-and-place using LumiScan Object Handling. The solution can be used for a variety of object detection and object handling tasks in manufacturing. Alongside numerous applications for


‘The light field sensor is well suited to inspecting complex objects’


forged parts in automotive, LumiScan Object Handling version two is also able to automate the picking and clearing of unsorted bagged goods. Here, the system not only detects the form of unstable bags and their orientation, but also delivers the corresponding information to the robot controller. The bags can then be placed accordingly into the machine and cut open.





