A Comparative Approach to Accelerate Backpropagation Neural Network Learning Using Different Activation Functions

Document Type: Original Article

Abstract

Slow convergence and long training times are still the disadvantages
often mentioned when neural networks are compared with other competing
techniques. One of the reasons of slow convergence in Backpropagation learning is
the diminishing value of the derivative of the commonly used activation functions
as the nodes approach extreme values, namely, 0 or 1. In this paper, we propose
eight activation functions to accelerate learning speed by eliminating the number
of iterations and increasing the convergence rate. Mathematical proving of the
errors for the output and hidden layers using these activation functions are
concluded. Statistical measures are also obtained using these different activation
functions. Through the simulated results, these activation functions are analyzed,
compared and tested. The analytical approach indicates considerable improvement
in training times and convergence performance.
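To make the diminishing-derivative argument concrete, the following minimal Python sketch (an illustrative addition; the function names and sample inputs are our own, not from the article) evaluates the logistic sigmoid and its derivative, sigma'(x) = sigma(x)(1 - sigma(x)), at a few inputs, showing the gradient shrinking as the output approaches 0 or 1.

    import numpy as np

    def sigmoid(x):
        # Logistic sigmoid, a commonly used Backpropagation activation.
        return 1.0 / (1.0 + np.exp(-x))

    def sigmoid_derivative(x):
        # sigma'(x) = sigma(x) * (1 - sigma(x)); vanishes as sigmoid(x) -> 0 or 1.
        s = sigmoid(x)
        return s * (1.0 - s)

    # As |x| grows, the output saturates near 0 or 1 and the derivative
    # becomes tiny, which slows the weight updates during training.
    for x in [0.0, 2.0, 5.0, 10.0]:
        print(f"x={x:5.1f}  sigmoid={sigmoid(x):.6f}  derivative={sigmoid_derivative(x):.6f}")

At x = 10 the derivative is roughly 4.5e-5, so gradient-driven weight updates at saturated nodes are correspondingly small; this is the saturation effect that alternative activation functions aim to mitigate.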

Keywords