FEATURE: SENSORS & SENSING SYSTEMS
WIRED OR WIRELESS?
Which is the best technology for sensor deployment?
Paul O’Shaughnessy, channel sales manager, Advantech IIoT, examines whether wired or wireless is the best technology for sensor deployment
Taking measurements from real-world processes has always been fundamental in
data acquisition systems. Some would even argue that defining the points to be measured was the starting point from which whole systems were specified, and ultimately dictated what the deployed systems could and could not do. This is no less true as IoT technologies are
adopted in the transition to Industry 4.0. In fact, the adoption of these new technologies increases the demand for more and more measurement data, as users factor an ever-increasing number of variables into the optimisation of their processes and equipment. However, the need to add more data
brings challenges. The inherent flexibility of IoT systems means that the required data measurements evolve over time, requiring strategies to integrate pre-existing measurement points with new ones, which can often be physically remote from any pre-existing wiring or communication links. In such cases, the cost of either adding the new
measurement points, or of recovering the data from them, can often be the deciding factor in whether a desired system enhancement makes economic sense. It is therefore understandable that engineers are interested in deploying wireless measurement systems, as these can often be installed much more quickly and cost-effectively than those involving the laying and routing of additional wired infrastructure. Deploying wireless sensing technology involves compromises, however, and it is the effect of these compromises that ultimately determines whether a wired or wireless architecture is selected and, if the latter, which radio technology is most suitable.
THE CASE FOR WIRED INFRASTRUCTURE
Traditionally, sensors have been connected to data acquisition systems via wired interfaces, typically 4-20mA current loops, but also, in the case of more specialised measurements, via
higher-speed voltage-based systems. These connections can carry power to the sensors, are reliable and accurate, offer faster measurement transmission and update times than wireless systems and, if correctly installed, are both highly secure and insensitive to noise and other interference. However, the cost of installation can be very high, often involving alterations to buildings or digging trenches in which to lay the cables, and cabling for a sensor in a remote location may not even be physically practical. In addition, it’s usually not possible to use wired interfaces with mobile equipment unless the range of movement is limited and can be accurately forecast.
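By way of illustration, the short sketch below shows the scaling arithmetic typically applied to a 4-20mA loop reading once it reaches an acquisition input: the live zero at 4mA means a reading is mapped linearly across a 16mA span, and a current well below 4mA can be treated as a wiring or transmitter fault. The 0-10 bar span and the 3.8mA fault threshold are illustrative assumptions, not values taken from any particular device.

def loop_current_to_value(current_ma: float,
                          span_low: float = 0.0,
                          span_high: float = 10.0) -> float:
    """Scale a 4-20mA loop current to an engineering value.

    span_low and span_high are the values represented by 4mA and 20mA;
    the 0-10 (bar) defaults are purely illustrative.
    """
    if current_ma < 3.8:
        # A current well below the 4mA live zero usually indicates a
        # broken loop or a failed transmitter, not a valid low reading.
        raise ValueError(f"loop fault: {current_ma:.2f} mA")
    fraction = (current_ma - 4.0) / 16.0      # 0.0 at 4mA, 1.0 at 20mA
    return span_low + fraction * (span_high - span_low)


if __name__ == "__main__":
    print(loop_current_to_value(12.0))        # mid-scale reading -> 5.0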
THE EVOLUTION OF REMOTE I/O
Telemetry has been used for many years to alleviate some of these limitations. By moving the physical sensor interface closer to the sensor, digitising its data at that point, and then sending the digitised data onwards via a serial or network connection, we can significantly reduce the cost of installation whilst also improving noise immunity, especially where multiple sensor interfaces exist at a single point. It also becomes possible to use radio to transmit the digitised data, extending the reach of data acquisition systems to geographically remote sites. The definition of ‘remote I/O’ is very broad. For the purposes of this discussion, we will
apply it to any device which interfaces to sensors and digitises the resulting measurements before onward transmission. This means we encompass traditional Remote Telemetry Unit (RTU) and remote I/O devices using protocols such as Modbus, but also USB devices aimed at lab measurements and fieldbus-based data acquisition nodes, as well as more modern incarnations such as IoT edge and gateway devices. The use of any such device introduces compromises that must be considered (the short sketch after this list puts rough numbers on the first and third of them):
• Digitising data implicitly places a constraint on the precision that can be recovered. Does the remote I/O device convert the data with enough resolution (i.e. number of bits) to be able to extract the nuances in the measurement required by any upstream analytics?
• Is the quality of the measurement (accuracy, repeatability, noise immunity etc.) sufficiently good?
• Transmitting data either serially or via a network takes a finite amount of time. This time is affected by the amount of data to be transmitted, the protocol overhead needed to encapsulate it, the speed of the communications medium, and the number of other devices sharing the same connection. Can the data be recovered quickly and often enough to provide the necessary insights and avoid the
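To put rough numbers on the first and third of these compromises, the sketch below estimates the smallest current step an ADC of a given bit depth can resolve across a 4-20mA span, and the time taken to move a small block of digitised readings over a shared serial link. The 12-byte framing overhead, 9600 baud rate and 10 bits per byte are illustrative assumptions rather than figures for any particular protocol or device.

def adc_resolution_ma(bits: int, span_ma: float = 16.0) -> float:
    """Smallest current step (in mA) an ADC of the given bit depth can
    distinguish across the 16mA usable span of a 4-20mA loop."""
    return span_ma / (2 ** bits)


def serial_transfer_time_s(payload_bytes: int,
                           overhead_bytes: int = 12,
                           baud: int = 9600,
                           bits_per_byte: int = 10) -> float:
    """Rough time to move a payload plus protocol overhead over a serial
    link; 10 bits per byte assumes one start and one stop bit, no parity."""
    total_bits = (payload_bytes + overhead_bytes) * bits_per_byte
    return total_bits / baud


if __name__ == "__main__":
    # A 12-bit converter resolves steps of roughly 4uA; 16 bits is ~0.24uA.
    print(f"12-bit step: {adc_resolution_ma(12):.5f} mA")
    print(f"16-bit step: {adc_resolution_ma(16):.5f} mA")
    # Ten 16-bit readings (20 bytes) take tens of milliseconds at 9600 baud,
    # before any queuing behind other devices sharing the same bus.
    print(f"transfer time: {serial_transfer_time_s(20) * 1000:.1f} ms")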