TECH FOCUS: GIGE VISION
stuff the XYZ components (as 3D data) into the R, G and B parts respectively, and then reconstruct the 3D data on the host side. That made it still a closed, proprietary system,' he said.

Now, with the multi-part payload in GigE Vision 2.1, and with GenDC coming as an alternative means of 3D transport in version 2.2, there are more possibilities for 3D and other types of data to be transported using open standards. GenDC extends beyond 3D and solves other use cases as well, such as new compression types. 'The benefit for the user goes back to interoperability,' Falconer said. 'If the user wants to use image processing software from one vendor, and that vendor is standards-compliant and supports GenDC or multi-part, then they can use a GigE Vision 2.1- or 2.2-compliant camera from another vendor and still use the software.'

There is now a desire to move to a standards-based Ethernet transport, Falconer said, but many 3D camera technologies are proprietary Ethernet-based. Pleora has released eBus Edge as part of its eBus SDK; eBus Edge is a set of libraries that allows the user to convert a proprietary 3D camera into a standards-based transport mechanism.
‘We’ve seen a large push, especially in Asia, to move [3D data transport] to a standards-based technology with GigE Vision’
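By contrast, a multi-part payload carries each component of an acquisition as an explicitly typed, self-describing part. The sketch below is conceptual only: the `Part` structure and field names are invented for illustration and do not reflect the GigE Vision wire format or the eBus SDK API, although the pixel format names follow GenICam PFNC conventions:

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class Part:
    # Illustrative metadata; real multi-part descriptors carry
    # standardised purpose/format identifiers, not free text.
    purpose: str       # e.g. "intensity", "range", "confidence"
    pixel_format: str  # e.g. "Mono8", "Coord3D_ABC32f" (PFNC names)
    data: np.ndarray


def make_multipart_frame(intensity, points, confidence):
    """Bundle the components of one 3D acquisition into one frame."""
    return [
        Part("intensity", "Mono8", intensity),
        Part("range", "Coord3D_ABC32f", points),
        Part("confidence", "Mono8", confidence),
    ]


h, w = 4, 4
frame = make_multipart_frame(
    np.zeros((h, w), np.uint8),
    np.zeros((h, w, 3), np.float32),
    np.full((h, w), 255, np.uint8),
)

# A receiver can dispatch on the declared purpose of each part
# instead of guessing what is hidden in colour channels.
purposes = [p.purpose for p in frame]
```

Because each part declares its own purpose and pixel format, any standards-compliant receiver can interpret the stream without vendor-specific knowledge.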
The user can write an application for transport over multi-part with eBus Edge to give 3D transport capability. 'Our libraries also allow you to customise the GenICam layer to add your own 3D-specific features and other features required to control your device or sensors,' Falconer said.

Pleora's eBus SDK includes a sample application that shows the user how to handle a stereoscopic 3D use case, for example, from both the GenICam and transport layers. In embedded vision, for example, a user might want to connect one or multiple Mipi sensors to an edge device. 'eBus Edge supports multiple sources, so you can, for example, connect two Mipi sensors to a compute device and then, on the Ethernet transport side, have two different logical streams of 2D GigE Vision data on the same Ethernet link,' Falconer explained. 'Alternatively, you can connect two Mipi sensors, gather the data in memory, do some processing to compute your 3D point cloud, and with our libraries take that data and transmit it using the multi-part payload, giving you a 3D GigE Vision-compliant data stream to connect to software from any GigE Vision-compliant software vendor.'

eBus Edge transmits uncompressed 3D data using multi-part payloads with low, predictable latency over Ethernet, directly to existing ports on a computer for analysis. Alongside GenDC
streaming, GigE Vision 2.2 will have multiple-event data functionality. Until GigE Vision 2.2, each asynchronous event message from the device to the host PC was sent separately; version 2.2 allows the user to concatenate multiple event messages into a single message.

EURESYS FEATURED PRODUCT
Euresys/Sensor to Image IP Cores speed up the implementation of the GigE Vision protocol in any embedded project. From the Top Level Design (which interfaces the imager, sensors and the GigE PHY to the FPGA's internal data processing) to Working Reference Designs, as well as hardware and software toolkits, Sensor to Image IP Cores deliver top-notch performance in a small footprint while leaving the developer enough flexibility to customise their design. The GigE Vision IP Cores are compatible with Xilinx 7 Series devices (and higher) and Intel/Altera Cyclone V devices (and higher). The toolkits exist in various configurations and come with sample applications and examples. Sensor to Image's offering also includes USB3 Vision, CoaXPress and CoaXPress-over-Fiber interface IP Cores, as well as Image Sensor IP Cores for MIPI CSI-2 and Sony's IMX Pregius sensors.
www.euresys.com/en/Products/IP-Cores/Vision-Standard-IP-Cores-for-FPGA/GigE-Vision-IP-Core-(1)

Going beyond version 2.2,
Falconer said that the GigE Vision technical committee's next big task is to look at speeds greater than 10Gb/s.

GigE Vision guarantees data delivery through a proprietary mechanism called packet resend. The transmitter must store data while the receiver is processing packets, so that the receiver can request retransmission of lost packets. The amount of data that must be stored for packet resends increases linearly with link speed.

Ethernet equipment can also ask a transmitter to temporarily stop streaming, so the transmitter must be able to buffer data to cope with pauses on the Ethernet link. During an update given at the Vision trade fair in October, Falconer said the GigE Vision committee is looking to improve the standard to reduce the memory requirement for devices, with the aim of enhancing the robustness and stability of data transfer at speeds above 10Gb/s.
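The linear growth in resend-buffer memory can be illustrated with rough arithmetic: if a transmitter must be able to retransmit everything sent within a recent time window, the buffer size is simply link rate multiplied by that window. The 10ms window below is an assumed figure for illustration, not a number from the standard:

```python
def resend_buffer_bytes(link_gbps: float, window_ms: float = 10.0) -> int:
    """Bytes of buffering needed to cover `window_ms` of traffic
    at the given link rate (assumed window, for illustration only)."""
    bits_per_second = link_gbps * 1e9
    return int(bits_per_second / 8 * (window_ms / 1000))


# Buffer requirement scales linearly with link speed.
for gbps in (1, 10, 25, 100):
    mb = resend_buffer_bytes(gbps) / 1e6
    print(f"{gbps:>3} Gb/s -> {mb:,.2f} MB of resend buffering")
```

Under this assumed window, a 1Gb/s link needs about 1.25MB of buffering, while a 100Gb/s link needs 125MB — which is why reducing device memory requirements matters as the standard moves beyond 10Gb/s.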
OCTOBER/NOVEMBER 2021 IMAGING AND MACHINE VISION EUROPE 25