Slow convergence and long training times are disadvantages often cited when neural networks are compared with competing techniques. One reason for slow convergence in backpropagation learning is the diminishing value of the derivative of commonly used activation functions as node outputs approach their extreme values, namely 0 or 1. In this paper, we propose eight activation functions to accelerate learning by reducing the number of iterations and increasing the convergence rate. Mathematical derivations of the error terms for the output and hidden layers under these activation functions are presented. Statistical measures are also obtained for the different activation functions. Through simulation, these activation functions are analyzed, compared, and tested. The analysis indicates considerable improvement in training times and convergence performance.
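The diminishing-derivative effect described above can be illustrated with the standard logistic sigmoid, whose derivative f'(x) = f(x)(1 - f(x)) peaks at 0.25 and shrinks toward zero as the node output saturates near 0 or 1. The sketch below is illustrative only; it is not one of the eight activation functions proposed in the paper.

```python
import math

def sigmoid(x):
    """Standard logistic activation, f(x) = 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    """Derivative f'(x) = f(x) * (1 - f(x)), used in backpropagation."""
    s = sigmoid(x)
    return s * (1.0 - s)

# At x = 0 the derivative is at its maximum, 0.25.
print(sigmoid_derivative(0.0))   # 0.25
# As the output saturates (e.g. x = 6, f(x) ~ 0.9975), the derivative
# becomes tiny, so the backpropagated weight update nearly vanishes.
print(sigmoid_derivative(6.0))
```

Because weight updates in backpropagation are proportional to this derivative, nodes driven into saturation contribute almost no gradient, which is the mechanism behind the slow convergence the paper addresses.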
(2009). A Comparative Approach to Accelerate Backpropagation Neural Network Learning using different Activation Functions. Journal of the ACS Advances in Computer Science, 3(1), 83-102. doi: 10.21608/asc.2009.158226