In-depth | LIGHTSHIP WEIGHT


Fig. 3. Comparison of actual and ANN outputs achieved after four training runs.


the learning rule, to search for the global minimum error, which can speed up and stabilise convergence during the training process. This ANN structure is advantageous for its ease of use and its ability to approximate any input-output map.
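Momentum learning augments plain gradient descent by carrying over a fraction of the previous weight update, which damps oscillation and speeds convergence toward the minimum. A minimal sketch in Python (the learning rate, momentum factor, and toy objective are illustrative, not the article's settings):

```python
# Gradient-descent weight update with momentum (a sketch; the
# hyperparameters and objective are illustrative).
def momentum_step(w, grad, velocity, lr=0.1, mu=0.9):
    """One update: the new velocity blends the previous step with the
    current gradient, which damps oscillation near the minimum."""
    velocity = mu * velocity - lr * grad
    return w + velocity, velocity

# Minimising f(w) = w**2 (gradient 2w) starting from w = 1.0:
w, v = 1.0, 0.0
for _ in range(100):
    w, v = momentum_step(w, 2 * w, v)
```

The momentum term is what distinguishes this rule from plain gradient descent: successive steps in a consistent direction reinforce each other, while direction reversals partially cancel.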


Results of the ANN Model
The ranges of all the variables of the experimental data are shown in Table 1. To develop the ANN model for lightship weight prediction, the software packages “Neurosolutions” and “Neurosolutions for Excel” [7] were used. The training of the ANN was terminated


at the 150th epoch. Of the 56 container ships gathered, forty-five were used to train the neural network and the remaining eleven were used as cross-validation data. During cross-validation, the network's response was tested in terms of how well it had been trained on the training data. By using four PEs in the hidden layer,
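The 45/11 split of the database can be sketched as follows (the shuffle seed and index stand-ins are illustrative; the article does not state how the eleven cross-validation ships were selected):

```python
import random

# Splitting 56 ships into 45 training and 11 cross-validation samples,
# as described for the container-ship database (a sketch; indices stand
# in for the actual ship records).
ships = list(range(56))   # stand-ins for the 56 gathered container ships
random.seed(0)            # fixed seed so the split is reproducible
random.shuffle(ships)
train, validation = ships[:45], ships[45:]
```

Holding out ships the network never sees during weight updates is what lets the cross-validation error detect over-fitting to the training set.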


Fig. 4. Correlation of experimental and predicted results of ANN model.


nature to be extracted before they result in the output. If the input-output data have considerable error, these errors are fed back (error back-propagation) to the input layer so that the connections between the variables can be adjusted again. Hence, by applying the gradient descent technique to the errors, the accuracy of the lightship weight approximation is improved. This cycle is called an epoch, and several epochs are usually needed to train a neural network, depending on its size and the amount of data entered. The number of PEs in the input layer is equal to the number of input variables (16),
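The epoch cycle described above — forward pass, error feedback, gradient-descent weight adjustment — can be illustrated with a single-weight toy model (the data, learning rate, and target function are invented for illustration; the article's network has 16 inputs and a hidden layer):

```python
# One epoch = a full pass over the data that feeds errors back and
# adjusts the weights. A minimal single-weight illustration: fit
# y = 2x by gradient descent on the squared error (toy data only).
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w, lr = 0.0, 0.01
for epoch in range(150):          # the article trains for 150 epochs
    for x, y in data:
        err = w * x - y           # forward pass: prediction error
        w -= lr * 2 * err * x     # back-propagated gradient step
```

Each epoch shrinks the residual error, which is why the MSE curves reported later fall with iteration count until training is stopped.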




and the number of PEs in the output layer is 1, which returns the corresponding lightship weight, Wlightship. The number of PEs in the hidden layer is a key parameter in the development of ANN models [6]: it has a direct effect on model accuracy. There is no general rule to determine this number, so a trial-and-error process was used. The number of PEs in the hidden layer was set to four, and the genetic algorithm (GA) option was selected to confirm the optimum number. The network was trained by using static
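The trial-and-error selection of the hidden-layer size amounts to scoring each candidate size by its cross-validation error and keeping the best. A sketch, where the scoring function is a stand-in for retraining the network at each size (the error values are illustrative, not output from Neurosolutions or its GA):

```python
# Trial-and-error search over the number of hidden-layer PEs (a sketch;
# cv_mse is a stand-in for "retrain the network with n_hidden PEs and
# measure cross-validation MSE" — the values below are illustrative).
def cv_mse(n_hidden):
    return {2: 0.012, 3: 0.008, 4: 0.0058, 5: 0.0061, 6: 0.007}[n_hidden]

# Keep the candidate size with the lowest cross-validation error.
best = min(range(2, 7), key=cv_mse)
```

A GA automates the same search over a wider range, but for a single integer parameter an exhaustive sweep like this is often sufficient.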


back-propagation as the learning paradigm and momentum learning as


(Fig. 2), the errors for this model in mean square error (MSE) terms were 0.0058 in training and 0.00612 in cross-validation, both at the 150th iteration and both small. Based on these errors, Neurosolutions provided the best coefficients achieved for the matrix of synaptic connections, to be assigned to each of the 16 input variables. In addition, the linear correlation coefficient is 0.935, which is fairly close to 1. These results indicate that the developed ANN model is quite accurate. Fig. 2 also shows the structure of this ANN model. A useful observation that can be drawn
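The two quality metrics quoted here, MSE and the linear correlation coefficient, can be computed directly from actual and predicted values. A generic sketch (the function names are ours; the article's data are not reproduced):

```python
import math

# The two quality metrics quoted for the model: mean square error and
# the (Pearson) linear correlation coefficient between actual and
# predicted lightship weights.
def mse(actual, predicted):
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A correlation of 0.935 means the predictions track the actual weights closely but not perfectly; the scatter about the diagonal in Fig. 4 shows where the residual error sits.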


from the scatter diagram of Fig. 4 is that the ANN predicted the correct lightship weight values quite successfully for container ships with 95m ≤ Lbp ≤ 195m and 285m ≤ Lbp ≤ 330m. Uncertainty analysis calculations were


performed according to ASME Standard PTC 19.1-1998 [8], using the sensitivity results for the 16 input parameters of the basic database that was used to train the network. A 32% uncertainty level was obtained, which corresponds to a confidence level of about 68% for the ANN model. The functional relationship produced by the
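PTC 19.1-style analysis combines the individual input uncertainties by root-sum-square, each weighted by its sensitivity coefficient. A sketch of that combination (the arguments would be the sensitivity and uncertainty values for the 16 inputs, which the article does not list; the values in the usage test are illustrative):

```python
import math

# Root-sum-square combination of input uncertainties weighted by
# sensitivity coefficients, in the spirit of ASME PTC 19.1 (a sketch;
# real sensitivities/uncertainties for the 16 inputs are not given here).
def combined_uncertainty(sensitivities, uncertainties):
    return math.sqrt(sum((s * u) ** 2
                         for s, u in zip(sensitivities, uncertainties)))
```

The root-sum-square form assumes the input uncertainties are independent, so error contributions partially cancel rather than add linearly.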


trained network is by no means similar to the

The Naval Architect November 2009

