Data acquisition


Making Big Data Usable


Dr Yanfeng Liang, mathematician at TÜV SÜD National Engineering Laboratory, explains why well-maintained and structured data systems are key to getting the most out of data.


In flow measurement, vast amounts of data are generated and stored by flow meters with built-in digital transmitters outputting at a high frequency. This output contains valuable information on performance and operating conditions; however, simply storing the data is not enough to maximise its usefulness. Data is only useful if appropriate modelling strategies are used to extract the underlying values and obtain new insights. To maximise valuable diagnostic information, advanced modelling techniques such as machine learning models must be used. Such models use data analytics algorithms that teach computers to perform tasks by learning from experience. There are effectively two categories, 'supervised learning' and 'unsupervised learning', each with its own advantages and disadvantages, and the deciding factor on which model to use depends heavily on the type of data and the question being addressed. A more detailed comparison between a supervised model and an unsupervised model is given in the table.

Ultrasonic flow meters (USMs) can provide information about meter health and process conditions, but different process conditions can create ambiguity for an end-user tasked with interpreting meter data. Machine learning models can be used to overcome this, as well as to extract the most important variables in determining a specific operating condition. This reduces the risk of including redundant and misleading data.
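As a rough illustration of the two categories, the short sketch below fits a supervised classifier to labelled data and an unsupervised clustering model to the same observations with the labels removed. Python with scikit-learn is assumed here, and the data and variable names are purely hypothetical rather than taken from the study.

```python
# Minimal sketch contrasting the two model categories on hypothetical meter data.
# Assumes Python with NumPy and scikit-learn; data and names are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier  # supervised: needs labels
from sklearn.cluster import KMeans                    # unsupervised: labels not used

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))             # 500 observations of 6 meter variables
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # hypothetical known operating condition

# Supervised: learns the mapping from input variables to the labelled condition.
clf = RandomForestClassifier(random_state=0).fit(X, y)

# Unsupervised: groups the same observations by similarity, with no labels at all.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

print(clf.predict(X[:5]))   # predicted condition for the first five observations
print(clusters[:5])         # cluster membership for the same observations
```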


To enhance our understanding, research was conducted in which varying percentages of gas were deliberately injected into the fluid to investigate the subsequent effects on meter performance, and a supervised machine learning model was built from the experimental data. While different USMs were tested, one example takes experimental data from a single USM, consisting of 55 variables and 11,081 observations, which was further divided into training data, validation data and unseen data. The training data was used to teach the model the expected patterns and interrelationships between variables when exposed to different percentages of gas, and the model's prediction capability was then tested using the validation data. The supervised machine learning model achieved an accuracy of 98.62 per cent in correctly classifying the data with respect to the correct operating condition (i.e. the percentage of gas present within the USM under test).

Groups of unseen data, not used during the training and validation stages of model development, were then used to further test the model's predictive capability. For example, for one group of unseen data the model predicted that 99.94 per cent of the data represented the condition where the fluid had 0 per cent GVF (gas volume fraction) present, with the remaining 0.06 per cent attributed incorrectly to other GVF groups. In other words, the model predicted with high accuracy that the unseen data was collected when the USM had no gas present within the fluid, and it is therefore highly probable, based on the patterns observed by the model, that the flow meter does not contain an unwanted second phase.
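A minimal sketch of this workflow appears below. It assumes Python with scikit-learn and uses a tree-based classifier as a stand-in, since the article does not name the algorithm used; the array shapes echo the 55 variables and 11,081 observations, but the synthetic data, GVF classes and split proportions are hypothetical.

```python
# Sketch of the training / validation / unseen split and accuracy check described above.
# Assumes Python with NumPy and scikit-learn; the classifier, split sizes and synthetic
# data are stand-ins, not the study's actual pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(11081, 55))                     # 11,081 observations x 55 USM variables
gvf = rng.choice([0.0, 1.0, 2.0, 5.0], size=11081)   # hypothetical GVF classes (per cent)

# Hold back "unseen" data first, then split the remainder into training and validation.
X_rest, X_unseen, y_rest, y_unseen = train_test_split(X, gvf, test_size=0.2, random_state=1)
X_train, X_val, y_train, y_val = train_test_split(X_rest, y_rest, test_size=0.25, random_state=1)

model = RandomForestClassifier(random_state=1).fit(X_train, y_train)

print("validation accuracy:", accuracy_score(y_val, model.predict(X_val)))
print("unseen-data accuracy:", accuracy_score(y_unseen, model.predict(X_unseen)))
```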


SUPERVISED VS. UNSUPERVISED MACHINE LEARNING MODELS

UNSUPERVISED MACHINE LEARNING MODEL
- Unlabelled data (more common to find in practice).
- Has input variables but no response (output) variables; no predefined conditions or classes.
- Separates data into groups based on how similar or different the data points are; can identify any underlying pattern; uses only input variables.
- More suitable for real-time monitoring due to its unsupervised nature.
- Model is more complex, with high computational cost; less interpretable and less accurate because there are no response variables.
- Examples: clustering models, Gaussian mixture model-based models and anomaly detection models.

SUPERVISED MACHINE LEARNING MODEL
- Labelled data (less common to find in practice).
- Has input variables and output variables; has predefined conditions or classes.
- Learns patterns and correlations using training data; identifies connections between input variables and output variables; variable selection is available.
- Usually takes place offline; model is simpler, with low computational cost; highly interpretable, reliable and accurate.
- Examples: tree-based models, neural networks, regression models and support vector machines.
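As a concrete illustration of the unsupervised column, the sketch below fits one of the example model types listed there, a Gaussian mixture model, to unlabelled meter data and flags low-likelihood points as candidate anomalies. Python with scikit-learn is assumed, and the data and threshold are illustrative only.

```python
# Sketch of an unsupervised Gaussian mixture model used for grouping unlabelled data
# and simple anomaly flagging, as per the table's unsupervised examples.
# Assumes Python with NumPy and scikit-learn; data and threshold are illustrative.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 6))           # unlabelled meter observations

gmm = GaussianMixture(n_components=3, random_state=2).fit(X)

groups = gmm.predict(X)                  # cluster membership, no labels required
log_likelihood = gmm.score_samples(X)    # how well each point fits the mixture

# Flag the least likely 1 per cent of points as candidate anomalies for follow-up.
threshold = np.quantile(log_likelihood, 0.01)
anomalies = np.where(log_likelihood < threshold)[0]
print(f"{anomalies.size} candidate anomalies out of {X.shape[0]} observations")
```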


The high accuracy of the model in distinguishing between different GVF groups, by finding hidden patterns and correlations, is promising; results such as these would benefit end-users who wish to identify how much gas is present within the fluid based on drifts experienced in certain variables. From an end-user perspective, the ability to predict the percentage of gas present within the fluid, and therefore the degree of effect on USM performance, can aid decision-making and maintenance processes.

The data used contained 55 different variables output from the USM, and each variable has multiple interrelationships with the others. Some of these variables are crucial in determining the health and performance of a flow meter, whilst others could be misleading and redundant in identifying the presence of gas within the system.
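One common way to separate the crucial variables from the redundant ones is to inspect the feature importances of a tree-based supervised model, as sketched below. This is a generic technique shown under assumed tooling (Python with scikit-learn) and hypothetical data, not necessarily the variable-selection method used in the study.

```python
# Sketch of ranking the 55 USM variables by importance with a tree-based model,
# so that redundant or misleading variables can be screened out.
# Assumes Python with NumPy and scikit-learn; data and labels are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
X = rng.normal(size=(11081, 55))                   # 55 variables output from the USM
y = rng.choice([0.0, 1.0, 2.0, 5.0], size=11081)   # hypothetical GVF class labels

model = RandomForestClassifier(random_state=3).fit(X, y)

# Rank variables by their contribution to the model's predictions.
ranked = sorted(enumerate(model.feature_importances_), key=lambda p: p[1], reverse=True)
for idx, importance in ranked[:10]:                # ten most informative variables
    print(f"variable_{idx:02d}: importance = {importance:.4f}")
```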

