This paper investigates a new learning algorithm (LF I) based on a Lyapunov function for the training of feedforward neural networks. The proposed algorithm has an interesting parallel with the popular back-propagation algorithm: the fixed learning rate of back-propagation is replaced by an adaptive learning rate computed using a convergence theorem based on Lyapunov stability theory. Next, the proposed algorithm is modified (LF II) to allow a smooth search in the weight space. The performance of the proposed algorithms is compared with the back-propagation algorithm and extended Kalman filtering (EKF) on two benchmark function approximation problems, XOR and 3-bit parity. The comparisons are made in terms of the learning iterations and computational time required for convergence. It is found that the proposed algorithms (LF I and LF II) converge faster than the other two algorithms in attaining the same accuracy. Finally, a comparison is made on a system identification problem, where it is shown that the proposed algorithms can achieve better function approximation accuracy.