Abstract: Recent years have witnessed increasing attention to entropy-based criteria in adaptive systems. Several principles have been proposed based on the maximization or minimization of entropic cost functions. One way of applying entropy criteria in learning systems is to minimize the entropy of the error between two variables: typically, one is the output of the learning system and the other is the target. In this paper, a classification method for multilayer Back Propagation (BP) neural networks is proposed. The usual mean square error (MSE) minimization principle is replaced by minimization of the cross-entropy (CE) of the differences between the multilayer perceptron's output and the desired target. These two cost functions are studied, analysed, and tested with three different activation functions, namely the trigonometric (sine) function, the hyperbolic tangent function, and the sigmoid activation function. The analytical approach indicates that the results are encouraging and promising, and that the cross-entropy cost function is a more appropriate error function than the usual mean square error.
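As a rough illustration of the quantities compared in the abstract (not the paper's own code), the sketch below defines the two cost functions, MSE and binary cross-entropy, together with the three activation functions tested (sine, hyperbolic tangent, sigmoid); all function names and the toy data are assumptions made for this example.

```python
import numpy as np

# Illustrative sketch only: the two cost functions compared in the paper
# and the three activation functions tested, applied to toy data.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def hyperbolic_tangent(x):
    return np.tanh(x)

def sine(x):
    return np.sin(x)

def mse(output, target):
    # Mean square error between the network output and the desired target.
    return np.mean((output - target) ** 2)

def cross_entropy(output, target, eps=1e-12):
    # Binary cross-entropy; outputs are clipped to avoid log(0).
    output = np.clip(output, eps, 1.0 - eps)
    return -np.mean(target * np.log(output)
                    + (1.0 - target) * np.log(1.0 - output))

if __name__ == "__main__":
    # Compare the two costs for hypothetical outputs against 0/1 targets.
    target = np.array([1.0, 0.0, 1.0, 0.0])
    output = sigmoid(np.array([2.0, -1.0, 0.5, -0.3]))
    print("MSE:", mse(output, target))
    print("CE: ", cross_entropy(output, target))
```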
(2008). Classification of Multilayer Neural Networks Using Cross Entropy and Mean Square Errors. Journal of the ACS Advances in Computer Science, 2(1), 29-48. doi: 10.21608/asc.2008.148466