Feature: Machine vision
Commonplace
Robots are commonplace across many of today's factories and production facilities. According to the most recent research conducted by the International Federation of Robotics, over the next two years another two million industrial robots will be installed in factories around the world. This growth is driven by two important trends: robots are becoming smarter, and they are increasingly deployed to work alongside human operators. Thanks to improved sensing capabilities and AI, intelligent systems can now learn by example, moving the scope of robots beyond the repetitive, labour-intensive tasks they were originally designed to carry out. As production engineering managers become more familiar with the capabilities of industrial robots, they see further opportunities for robots to take on more demanding tasks. Thanks to machine-learning techniques, a robot can now learn from a human operator who guides its arms and tool selections through the various steps needed to complete a task. Once guided, machine learning can optimise the movements for greater efficiency. As robots become even more intelligent thanks to vision-processing algorithms, they will manage themselves based on watching a human undertake tasks.

As robots become more capable and environment-aware, they will move out from behind protective screens to work physically closer to human workers. Collaboration is a complex set of actions that we instinctively pick up very quickly when working alongside another person. We rely on visual cues, gestures and voice, which a robot will need to learn to read as indications of intent. Consequently, these collaborative robots, or "cobots", require significantly more sensing capability, in addition to being able to connect to and control other items of production machinery, to be effective and productive.

Processing technologies
Vision is one of the essential attributes of an industrial robot. The machine-vision technology it is based on has been used in industrial automation for decades for various tasks, including pattern recognition, image matching and image processing. Today's cameras, however, can collect greater volumes of data than before: they measure the dimensions of objects within an image frame, indicate object temperature, distinguish colours, and so on. Deep-learning neural networks, such as the convolutional neural networks (CNNs) that perform image classification and inspection tasks, are becoming the norm for vision processing. Pre-trained neural networks can recognise and identify objects exceptionally quickly, and as camera specifications improve and neural-network algorithms become more sophisticated, error rates continue to decrease. But there are some key challenges associated with automated optical inspection (AOI) systems, including the need for processing power beyond that offered by ordinary microprocessors. The main reason is the compute requirement of the algorithms when production lines are moving at speed. Traditionally, this demand for high levels of computing capacity and processing bandwidth has been met by graphics processing units (GPUs) or field-programmable gate arrays (FPGAs). However, the demands of robotic applications will continue to push the complexity of these devices and their power consumption. Cloud-based data centres, where real-time on-demand computing capacity is, in effect, infinitely scalable, may offer an alternative to GPUs and FPGAs. But the technical challenges that come with long latencies, varying bandwidth and a lack of deterministic behaviour shouldn't be underestimated. Cybersecurity is also a concern, as the entire industrial automation infrastructure may, at some point, become exposed to threats and adversaries. To overcome all these challenges, including an increasing demand for low-latency, high-bandwidth neural-network processing, semiconductor vendors have developed compute solutions optimised for inference (running a trained neural network, as opposed to training one). Inference, the term used
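To make the inference workload discussed above concrete, the following minimal NumPy sketch runs the core stages of a CNN forward pass (convolution, ReLU, pooling, a dense classifier and softmax) over a toy image. The filter weights, image and three-class output layer are made-up placeholders, not a real trained network; production inference engines execute the same kinds of operations, but with trained weights and heavily optimised kernels.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation, as in most CNN frameworks)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling; trims edges that don't divide evenly."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

# Toy 8x8 "image" and a hand-made 3x3 vertical-edge filter (placeholder weights).
rng = np.random.default_rng(0)
image = rng.random((8, 8))
kernel = np.array([[1, 0, -1], [1, 0, -1], [1, 0, -1]], dtype=float)

features = max_pool(relu(conv2d(image, kernel)))  # 8x8 -> 6x6 -> 3x3 feature map
flat = features.flatten()

# Placeholder dense layer mapping the features to 3 hypothetical defect classes.
weights = rng.random((3, flat.size))
probs = softmax(weights @ flat)  # class probabilities, summing to 1
```

On real AOI hardware, these same multiply-accumulate-heavy loops are what GPUs, FPGAs and dedicated inference accelerators parallelise to keep up with line speed.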
www.electronicsworld.co.uk November/December 2020 51