News


Consumer electronics boosts Q3 revenue for Cognex


Large orders in the consumer electronics industry and strong performance in the automotive and logistics sectors have led to machine vision firm Cognex reporting record results for the third quarter of 2017, which ended on 1 October. Revenue of $260 million and net income of $102.3 million were generated, increases of 76 per cent and 91 per cent year-on-year respectively, and of 50 per cent and 83 per cent over the second quarter.

As of 1 October, the firm's financial position is particularly strong, with $771 million in cash and investments – up $26 million from the end of 2016 – and no debt. The firm's research, development and engineering (RD&E) expenses increased 40 per cent year-on-year thanks to additional engineering resources – including employees added from recent acquisitions – as it continued to invest in both current and new products. Inventories increased by 78 per cent ($21 million) from the end of 2016 to support new product introductions, in addition to Cognex's substantially higher level of business.

Revenue for the fourth quarter is expected to be between $170 million and $180 million, with $52 million of unbilled revenue from 1 October to be invoiced largely throughout the period.


Researchers use eye-inspired camera to guide drones


Researchers at the University of Zurich (UZH) and the Swiss research consortium NCCR Robotics are using eye-inspired cameras to guide drones in low-light conditions. The work could lead to drones being able to perform fast, agile manoeuvres in challenging environments and applications, such as search and rescue missions in urban areas at dusk or dawn.

The vision system used is a prototype event camera, a type of camera using a bio-inspired retina to capture clear pictures without needing a full amount of light across the entire sensor (Matthew Dale writes about event cameras on page 14). Unlike their standard counterparts, event cameras only report changes in brightness for each pixel, rather than colour as well, enabling them to capture sharp images even during fast motion or in low-light environments.

Drones equipped with an event camera and the researchers' software could assist in search and rescue scenarios in low-light conditions where normal cameras would be ineffective. They would also be able to fly faster in disaster areas, where time is critical in saving survivors.

'This research is the first of its kind in the fields of artificial intelligence and robotics, and will soon enable drones to fly autonomously and faster than ever,' said Professor Davide Scaramuzza, director of the Robotics and Perception Group at UZH.

Scaramuzza and his team have already taught drones to use onboard event cameras to infer their position and orientation in space, allowing them to fly safely. This is particularly significant, as drones usually require GPS in order to accomplish this, which can often be unreliable and only works outdoors.

There is still work to be done before these drones can be deployed in the real world, as the event camera is still an early prototype.
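The per-pixel behaviour described above can be illustrated with a short, hypothetical sketch – this is a toy model of how an event camera differs from a frame camera, not the researchers' actual pipeline. Each pixel emits an 'event' only when its log-brightness changes by more than a threshold, so a static scene produces no data at all:

```python
import numpy as np

def events_from_frames(prev, curr, threshold=0.2):
    """Toy event-camera model: compare two grayscale frames and emit
    (x, y, polarity) events wherever per-pixel log-brightness changed
    by more than `threshold`. Illustrative only; real event cameras
    do this asynchronously in hardware, per pixel."""
    eps = 1e-6  # avoid log(0) on dark pixels
    diff = np.log(curr + eps) - np.log(prev + eps)
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    polarity = np.sign(diff[ys, xs]).astype(int)  # +1 brighter, -1 darker
    return list(zip(xs.tolist(), ys.tolist(), polarity.tolist()))

# A static scene produces no events; only the changed pixel reports.
prev = np.full((4, 4), 0.5)
curr = prev.copy()
curr[1, 2] = 1.0  # one pixel brightens
print(events_from_frames(prev, curr))  # [(2, 1, 1)]
```

The sparsity is the point: because unchanged pixels stay silent, the sensor's dynamic range and temporal resolution are not limited by a global exposure, which is why such cameras cope with fast motion and dim light.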


MIPI Alliance releases camera command set specification


The MIPI Alliance has released a specification providing a standardised way to integrate image sensors in mobile devices. The new specification, MIPI Camera Command Set v1.0 (MIPI CCS v1.0), defines a standard set of functionalities for implementing and controlling image sensors. The specification is offered for use with MIPI Camera Serial Interface 2 v2.0 (MIPI CSI-2 v2.0) and is now available for download. In an effort to help standardise use of MIPI CSI-2, MIPI Alliance membership is not required to access the specification.

MIPI CSI-2 is being considered by the Future Standards Forum of the G3 vision group as a potential machine vision standard to cater for embedded cameras. Industrial vision companies are already releasing products with the interface, such as Allied Vision's Alvium ASIC in its 1 product line of cameras, and Basler's Dart cameras.

MIPI CSI-2 is a widely used hardware interface for deploying camera and imaging components in mobile devices. The introduction of MIPI CCS to MIPI CSI-2 further increases interoperability, and reduces integration effort and costs for complex imaging and vision systems, says the MIPI Alliance.

MIPI CCS offers a common software driver to configure the basic functionalities of any off-the-shelf image sensor compliant with MIPI CCS and MIPI CSI-2 v2.0. The new specification provides a complete command set that can integrate basic image sensor features such as resolution, frame rate and exposure time, as well as advanced features like phase detection autofocus (PDAF), single-frame HDR or fast bracketing.

'The overall advantage of MIPI CCS is that it will enable rapid integration of basic camera functionalities in plug-and-play fashion, without requiring any device-specific drivers, which has been a significant pain point for developers,' said Mikko Muukki, technical lead for MIPI CCS.

6 Imaging and Machine Vision Europe • December 2017/January 2018
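As a rough illustration of why a common command set enables one driver for many sensors – the class and control names below are hypothetical, not taken from the actual MIPI CCS register map:

```python
# Hypothetical sketch of the plug-and-play idea behind a common
# command set: one generic driver writes the same named controls to
# any compliant sensor, instead of shipping per-device driver code.
# `CompliantSensor` and the control names are illustrative only.

class CompliantSensor:
    """Any sensor that accepts the shared control names can be
    configured by the same generic code path."""
    def __init__(self, model):
        self.model = model
        self.registers = {}

    def write(self, control, value):
        self.registers[control] = value

def configure(sensor, width, height, fps, exposure_us):
    # The same routine works for every compliant sensor, regardless
    # of vendor -- no device-specific driver is needed for the basics.
    sensor.write("output_width", width)
    sensor.write("output_height", height)
    sensor.write("frame_rate", fps)
    sensor.write("exposure_time_us", exposure_us)
    return sensor.registers

cam = CompliantSensor("vendor-x-sensor")
print(configure(cam, 1920, 1080, 30, 10_000))
```

This is the 'plug-and-play' benefit Muukki describes: the driver targets the specification rather than any individual part, so swapping sensors does not mean rewriting integration code.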





