Classification of Multilayer Neural Networks Using Cross Entropy and Mean Square Errors

Document Type : Original Article

Abstract

Abstract: Recent years have witnessed increasing attention to entropy-based criteria in
adaptive systems. Several principles have been proposed based on the maximization or
minimization of entropic cost functions. One way of applying entropy criteria in learning
systems is to minimize the entropy of the error between two variables: typically, one is the
output of the learning system and the other is the target. In this paper, a classification
scheme for multilayer Back Propagation (BP) neural networks is proposed. The usual mean
square error (MSE) minimization principle is replaced by the minimization of the
cross-entropy (CE) of the differences between the multilayer perceptron's output and the
desired target. These two cost functions are studied, analyzed, and tested with three
different activation functions, namely the trigonometric (sin) function, the hyperbolic
tangent function, and the sigmoid activation function. The analytical results are
encouraging and indicate that the cross-entropy cost function is a more appropriate error
function than the usual mean square error.
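The two cost functions compared in the abstract can be sketched as follows. This is a hypothetical illustration, not the paper's own code: it evaluates the MSE and binary cross-entropy on a toy output/target pair for each of the three activation functions named above (sin, tanh, sigmoid). The toy inputs and the rescaling of sin/tanh outputs into (0, 1) are assumptions made so that the cross-entropy is well defined.

```python
import numpy as np

def mse(y, t):
    """Mean square error between network output y and target t."""
    return np.mean((y - t) ** 2)

def cross_entropy(y, t, eps=1e-12):
    """Binary cross-entropy; y must lie in (0, 1), hence the clipping."""
    y = np.clip(y, eps, 1 - eps)
    return -np.mean(t * np.log(y) + (1 - t) * np.log(1 - y))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# The three activation functions studied in the paper.
activations = {"sin": np.sin, "tanh": np.tanh, "sigmoid": sigmoid}

x = np.array([0.2, -0.5, 1.0])   # toy pre-activations (assumed values)
t = np.array([1.0, 0.0, 1.0])    # toy binary targets (assumed values)

for name, f in activations.items():
    y = f(x)
    # sin and tanh output in [-1, 1]; rescale into (0, 1) so CE is defined.
    y01 = y if name == "sigmoid" else (y + 1.0) / 2.0
    print(f"{name}: MSE={mse(y01, t):.4f}  CE={cross_entropy(y01, t):.4f}")
```

In training, either quantity would serve as the objective minimized by back propagation; the paper's claim is that using CE in place of MSE yields a more appropriate error surface for classification.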
