TECH FOCUS: DEEP LEARNING


Lessons in training neural nets


Limited data is a common problem when training CNNs in industrial imaging applications. Petra Tanner and Daniel Soukup, from the Austrian Institute of Technology, discuss ways of working with CNNs when data is scarce


Deep learning – or neural networks – has had a triumphal march over the last decade. Since sufficient computing power has become available with GPUs, neural networks have been trained to handle many tasks in various areas, from language processing in text and sound, to image processing, classification, and anomaly detection. Numerous publications report the successful adoption of deep learning for various datasets and tasks. Today’s widespread opinion about deep learning is that the only necessary ingredients are a data sample and a task. After that, the neural network is able to identify the relevant patterns in the data all by itself and give highly sophisticated and reliable predictions on complex data problems. Unfortunately, this is not always correct.


Industrial inspection

In industrial inspection, image processing has always been a core discipline. To solve inspection problems during the pre-deep learning era, machine vision engineers often analysed only a handful of image samples and developed elaborate mathematical models that approximately described the visible, relevant geometrical or statistical structures. This was a painstaking procedure and had to be repeated each time new insights emerged. Today’s convolutional neural networks (CNNs) promise to make this process much easier and, above all, more cost-effective, since all relevant information is automatically extracted from sample images. And the CNN training procedure can easily be re-run whenever new and extended training data are available.


The only necessary component is computing power. However, there are also pitfalls with drawing all information only from training samples. Characteristic appearance variations not covered in the training data will probably not be recognised correctly in the CNN’s inference phase. Therefore, training data for neural networks must contain samples covering as many variations in appearance as possible. While CNNs are indeed remarkably capable of generalising the presented training samples to a certain extent, they might also over-fit on scarce training data, and so jeopardise their prediction performance on unseen data. In addition, object classes that are missing entirely cannot usually be handled properly.
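As a concrete illustration – a minimal sketch rather than the authors’ own method – one widely used precaution against scarce training data is to widen the appearance variations synthetically with image augmentation. The transforms and parameter values below are illustrative assumptions:

```python
# Hypothetical augmentation pipeline for a small industrial image set
# (PyTorch/torchvision); all parameter values are illustrative assumptions.
import torchvision.transforms as T

train_transform = T.Compose([
    T.RandomRotation(degrees=10),                 # slight part misalignment
    T.ColorJitter(brightness=0.2, contrast=0.2),  # lighting drift between shifts
    T.RandomHorizontalFlip(p=0.5),                # only if symmetry is plausible
    T.RandomResizedCrop(224, scale=(0.9, 1.0)),   # small framing changes
    T.ToTensor(),
])

# Applied on the fly during training, so every epoch presents slightly
# different versions of the same scarce samples to the network.
```

Augmentation of this kind only helps for variations that can plausibly be simulated; it cannot conjure up defect classes that are missing from the training set altogether.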




Only in special settings, or when special precautions have been taken, will CNNs exhibit some awareness of novelty in the data. This means being able to identify anomalous data with regard to what has been presented during the training phase.
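One illustrative form such a precaution can take – a sketch under assumed architecture choices, not necessarily the approach used at AIT – is a convolutional autoencoder trained only on defect-free images, so that a high reconstruction error flags a sample as novel:

```python
# Sketch of reconstruction-based novelty detection (PyTorch).
# The architecture and the use of mean squared error are assumptions.
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Compress single-channel images to a small latent representation...
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        # ...and reconstruct them; trained on defect-free samples only.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 2, stride=2), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

def anomaly_score(model, image):
    """Mean squared reconstruction error; a detection threshold would be
    calibrated on held-out defect-free samples."""
    model.eval()
    with torch.no_grad():
        return torch.mean((model(image) - image) ** 2).item()
```

Images whose score exceeds the calibrated threshold are treated as anomalous with regard to the training distribution.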


The academic community has compiled multiple thorough and balanced sample datasets, on which the potential of new deep learning algorithms is explored, tested and published. For industrial applications, large databases – of images of production defects, for example – are not often readily available. Unbalanced and insufficient data contain a bias, so that the CNN over-fits on





