TECH FOCUS: GIGE VISION
GigE gives 3D a new lease of life
Greg Blackman speaks to Pleora’s James Falconer about the latest versions of GigE Vision
The number of 3D vision cameras and technologies is proliferating, and attention is turning to how to transport 3D vision data in a standardised way. The answer for most machine vision interface standards is GenDC, or generic data container, a module within the latest version of GenICam that can be used to describe more complex data formats such as 3D data. GenDC defines how image data is represented, transmitted or received, independent of its format. CoaXPress already supports GenDC in its latest version, meaning that standard now has full 3D support; GigE Vision version 2.2 will include it too when it is released at the end of the year.

In 2018, when the current version of GigE Vision, 2.1, was introduced, a multi-part payload was added. This allows the user to put different data types into one single logical frame, enabling the transport of 3D data as part of a single container.

The way the transport layer works in GigE Vision with a typical 2D image, explained James Falconer, product manager at Pleora Technologies and vice chair of the GigE Vision technical group, is to first send a leader, which describes the incoming image; then the image data; and then a trailer that signifies the end of that frame. Multi-part means
different data types can be put in one container. That was the introduction of 3D into GigE Vision.
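As a loose illustration of that idea, and not the GVSP packet format the standard actually defines, a multi-part frame can be modelled as a leader listing the typed parts, the data for each part, and a trailer closing the logical frame. The class and field names in the sketch below are hypothetical; the pixel format strings merely follow the PFNC naming style.

# Conceptual sketch of a multi-part logical frame, using hypothetical names;
# it mirrors the leader/part-data/trailer sequence described above, not the
# actual GVSP wire format defined by the GigE Vision specification.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class PartDescriptor:
    """One component of a multi-part payload, e.g. range, intensity or confidence."""
    data_type: str       # e.g. "Range3D", "Intensity", "Confidence"
    pixel_format: str    # PFNC-style name, e.g. "Coord3D_C16", "Mono8"
    width: int
    height: int
    payload: bytes = b""

@dataclass
class MultiPartFrame:
    """A single logical frame carrying several data types at once."""
    block_id: int
    parts: List[PartDescriptor] = field(default_factory=list)

def transmit(frame: MultiPartFrame, send: Callable) -> None:
    """Send a leader describing the parts, then each part's data, then a trailer."""
    send(("LEADER", frame.block_id,
          [(p.data_type, p.pixel_format, p.width, p.height) for p in frame.parts]))
    for part in frame.parts:                     # data for every part follows the leader
        send(("PART_DATA", frame.block_id, part.data_type, len(part.payload)))
    send(("TRAILER", frame.block_id))            # marks the end of the logical frame

# Example: a 3D frame bundling a 640x480 depth map with its intensity image
frame = MultiPartFrame(block_id=1, parts=[
    PartDescriptor("Range3D", "Coord3D_C16", 640, 480, bytes(640 * 480 * 2)),
    PartDescriptor("Intensity", "Mono8", 640, 480, bytes(640 * 480)),
])
transmit(frame, send=print)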
The GigE Vision standard was
initially released in 2006, and, as part of that effort, the GenICam technical group formed alongside the GigE Vision group to help with interoperability. ‘Typically, 3D technologies
up until recently have had a proprietary transport mechanism, even if they are based on Ethernet,’ Falconer
said. ‘More recently we’ve seen a large push, especially in Asia, to move that to a standards-based technology with GigE Vision as the underlying transport layer.’

Originally there wasn’t a defined way to send 3D data, Falconer explained. ‘Before the introduction of GigE Vision 2.1 there were cameras that would use GigE Vision to transmit 3D data, but they used to take RGB pixel data and use that as a container to

Commercial products

New GigE Vision camera models on the market include updated versions of Teledyne Flir’s Blackfly S line and Lucid Vision Labs’ Atlas line, which has just entered production. The additions to the Blackfly S GigE camera line, the BFS-PGE-50S4M-C and BFS-PGE-50S4C-C, are 5-megapixel models suitable for integration into handheld devices. The cameras use Sony’s IMX547 global shutter sensor; offer lossless compression to move from 24fps to 30fps at full resolution; have 68 per cent quantum efficiency at 525nm; and 2.49e- read noise.

The Atlas IP67-rated camera line from Lucid Vision Labs has a 5GigE PoE interface and features Sony Pregius global and rolling shutter CMOS sensors. The first models available include the 7.1-megapixel Sony IMX420 sensor running at 74.6fps, and the 20-megapixel IMX183 sensor running at 17.8fps. The cameras feature active sensor alignment for excellent optical performance; M12 Ethernet and M8 general purpose I/O connectors that are resistant to shock and vibration; and industrial EMC immunity. They operate over -20°C to 55°C and measure 60 x 60mm. The 5GBase-T Atlas is a GigE Vision- and GenICam-compliant camera capable of 600MB/s data transfer rates (5Gb/s) over CAT5e and CAT6 cables up to 100 metres in length.

In addition, Kithara has released a real-time driver for the PGC-1000 frame grabber card by PLC2 Design. The PCIe card acquires and converts GigE Vision data, for which Kithara’s RealTime software suite provides real-time functionality. The card handles the entire conversion process of captured GigE Vision data, relieving the CPU during image acquisition. In this way, multiple camera streams (4 x 10Gb/s or 1 x 40Gb/s) can be operated at minimal CPU load, and the captured image data can be stored on SSDs within the same real-time context. Additionally, real-time synchronisation of multiple cameras is achievable with the Kithara PTP feature.
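As a rough sanity check on the 5GBase-T Atlas figures above, and assuming 8 bits per pixel and roughly 600MB/s of usable payload bandwidth after protocol overhead (the quoted frame rates may equally be limited by sensor readout rather than the link), the frame-rate ceiling imposed by the link alone can be estimated as follows.

# Back-of-the-envelope link budget for a 5GBase-T camera, assuming 8-bit pixels
# and ~600MB/s of usable GigE Vision payload bandwidth after protocol overhead.

LINK_MB_PER_S = 600.0        # assumed usable payload bandwidth, MB/s

def link_fps_ceiling(megapixels: float, bytes_per_pixel: float = 1.0) -> float:
    """Frame-rate ceiling set by the link alone; ignores sensor readout limits."""
    frame_mb = megapixels * bytes_per_pixel      # MB per frame (1MP = 1e6 pixels)
    return LINK_MB_PER_S / frame_mb

print(f"7.1MP Mono8: {link_fps_ceiling(7.1):.0f}fps link ceiling (Atlas quotes 74.6fps)")
print(f"20MP Mono8: {link_fps_ceiling(20.0):.0f}fps link ceiling (Atlas quotes 17.8fps)")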