Conjugate Gradient Back-propagation with Modified Polak–Ribière Updates for Training Feed-Forward Neural Networks

Section: Article
Published: Dec 1, 2011
Pages: 164–173

Abstract

Several learning algorithms have been developed for feed-forward neural networks (FFNs). Many of them are based on the gradient descent method, well known in optimization theory, which often performs poorly in practical applications. In this paper we modify the Polak–Ribière conjugate gradient method to train feed-forward neural networks. Our modification is based on the secant equation (quasi-Newton condition). The suggested algorithm is tested on some well-known test problems and compared with other algorithms in this field.
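The abstract does not reproduce the modified update itself, but the ingredients it names are standard. The classical Polak–Ribière method updates the search direction as d_{k+1} = -g_{k+1} + beta_k d_k with beta_k = g_{k+1}^T (g_{k+1} - g_k) / (g_k^T g_k), and the secant (quasi-Newton) condition the authors build on is B_{k+1} s_k = y_k, where s_k = w_{k+1} - w_k and y_k = g_{k+1} - g_k. The sketch below shows how such a conjugate-gradient loop replaces plain gradient descent when training a feed-forward network; since the paper's secant-based beta is not given in the abstract, the sketch uses the classical Polak–Ribière formula with a PR+ restart, and the network sizes, data, and names are all illustrative assumptions, not the authors' setup.

import numpy as np

# Tiny feed-forward network (2 inputs -> 3 tanh hidden -> 1 sigmoid output)
# trained on XOR. Parameters live in one flat vector w so the conjugate
# gradient update operates on a single array. Sizes and data are illustrative.

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0.], [1.], [1.], [0.]])

def loss_and_grad(w, X, Y):
    W1, b1 = w[:6].reshape(2, 3), w[6:9]
    W2, b2 = w[9:12].reshape(3, 1), w[12:]
    H = np.tanh(X @ W1 + b1)                   # hidden activations
    P = 1.0 / (1.0 + np.exp(-(H @ W2 + b2)))   # network output
    E = P - Y
    loss = 0.5 * np.mean(np.sum(E ** 2, axis=1))
    n = X.shape[0]
    dZ2 = E * P * (1.0 - P) / n                # backprop through sigmoid
    dZ1 = (dZ2 @ W2.T) * (1.0 - H ** 2)        # backprop through tanh
    grad = np.concatenate([(X.T @ dZ1).ravel(), dZ1.sum(0),
                           (H.T @ dZ2).ravel(), dZ2.sum(0)])
    return loss, grad

rng = np.random.default_rng(0)
w = rng.normal(scale=0.5, size=13)
loss, g = loss_and_grad(w, X, Y)
d = -g                                         # first direction: steepest descent
for k in range(500):
    if g @ d >= 0:                             # safeguard: restart if d is
        d = -g                                 # not a descent direction
    alpha = 1.0                                # Armijo backtracking line search
    while True:
        new_loss, _ = loss_and_grad(w + alpha * d, X, Y)
        if new_loss <= loss + 1e-4 * alpha * (g @ d) or alpha < 1e-10:
            break
        alpha *= 0.5
    w = w + alpha * d
    loss, g_new = loss_and_grad(w, X, Y)
    # Classical Polak-Ribiere beta, truncated at zero (the PR+ restart).
    # The paper's contribution replaces this formula with a variant derived
    # from the secant equation; that variant is not shown in the abstract.
    beta = max(0.0, g_new @ (g_new - g) / (g @ g + 1e-12))
    d = -g_new + beta * d
    g = g_new

The restart safeguard and the PR+ truncation keep d a descent direction, which is what lets a conjugate-gradient loop of this shape retain the robustness of gradient descent while typically converging in far fewer iterations.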

How to Cite

Al-Bayati, A., Saleh, I. A., & Abbo, K. K. (2011). Conjugate gradient back-propagation with modified Polak–Ribière updates for training feed-forward neural networks. Iraqi Journal of Statistical Sciences, 11(2), 164–173. https://doi.org/10.33899/iqjoss.2011.027897