On adaptive learning rate that guarantees convergence in feedforward networks

Laxmidhar Behera, Swagat Kumar, Awhan Patnaik

Research output: Contribution to journal › Article

107 Citations (Scopus)

Abstract

This paper investigates two new learning algorithms (LF I and LF II), based on a Lyapunov function, for training feedforward neural networks. These algorithms have an interesting parallel with the popular backpropagation (BP) algorithm: the fixed learning rate is replaced by an adaptive learning rate computed using a convergence theorem based on Lyapunov stability theory. LF II, a modified version of LF I, is introduced with the aim of avoiding local minima; this modification also improves the convergence speed in some cases. Conditions for achieving a global minimum with these kinds of algorithms are studied in detail. The performance of the proposed algorithms is compared with the BP algorithm and extended Kalman filtering (EKF) on three benchmark function approximation problems: XOR, 3-bit parity, and 8-3 encoder. The comparisons are made in terms of the number of learning iterations and the computational time required for convergence. The proposed algorithms (LF I and II) are found to converge much faster than the other two algorithms while attaining the same accuracy. Finally, a comparison is made on a complex two-dimensional (2-D) Gabor function, and the effect of the adaptive learning rate on faster convergence is verified. In a nutshell, the investigations made in this paper help us better understand the learning procedure of feedforward neural networks in terms of adaptive learning rate, convergence speed, and local minima. © 2006 IEEE.
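The central idea of the abstract — replacing BP's fixed learning rate with a rate derived from a stability condition so that a Lyapunov function of the error is guaranteed to decrease — can be illustrated on a toy linear-in-weights model. The sketch below uses the classic normalized-LMS step size, whose bound 0 < μ < 2 comes from exactly this kind of stability argument; it is only an illustration of the principle, not the paper's actual LF I/LF II update for multilayer networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-in-weights model with a noiseless target: y = x^T w_true.
w_true = rng.normal(size=5)
X = rng.normal(size=(200, 5))
y = X @ w_true

w = np.zeros(5)
mu = 0.5                              # 0 < mu < 2: stability-derived bound
for x, t in zip(X, y):
    e = t - x @ w                     # instantaneous output error
    eta = mu / (x @ x + 1e-12)        # adaptive rate, recomputed every step
    w += eta * e * x                  # update shrinks this sample's error to (1 - mu) * e

# With the adaptive rate, V = e^2 decreases at every step and w approaches w_true.
assert np.allclose(w, w_true, atol=1e-3)
```

Note the design choice: the learning rate is not a tuned constant but is recomputed each step from the current input, so the error-energy Lyapunov candidate provably shrinks; this mirrors, in a much simpler setting, the adaptive-rate mechanism the paper develops for feedforward networks.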
Original language: English
Pages (from-to): 1116-1125
Number of pages: 10
Journal: IEEE Transactions on Neural Networks
Volume: 17
Issue number: 5
DOIs
Publication status: Published - 30 Sep 2006

Keywords

  • Adaptive learning rate
  • Backpropagation (BP)
  • Extended Kalman filtering (EKF)
  • Feedforward networks
  • Lyapunov function
  • Lyapunov stability theory
  • System identification
