RBF Networks Trainer

Download (1028k) - RBF_Trainer, written in C++, which runs on Windows (unpack to a directory and run RBF_Trainer.exe)

Download (1064k) - instructions for RBF_Trainer

 

NBN - Neuron by Neuron Trainer

Neural network training software for networks with arbitrarily connected neurons

You are free to download and use the software:

(1) Download (1470k) - NBN-2.08, written in C++, which runs on Windows (unpack to a directory and run NBN 2.08.exe)

  

The following algorithms are currently implemented in the C++ version (1):

EBP: This is the EBP (error back-propagation) algorithm with traditional forward-backward computation; for EBP it may work slightly faster than the forward-only computation. It is currently used only for standard MLP networks. EBP converges slowly, but it can be used for training on very large pattern sets.

LM: This is the LM (Levenberg-Marquardt) algorithm with traditional forward-backward computation; for the LM (and NBN) algorithm, the improved forward-only computation trains faster than forward-backward computation on networks with multiple outputs. It is also currently used only for standard MLP networks. The LM (and NBN) algorithm converges much faster than EBP for small and medium-sized pattern sets.

NBN: This is the NBN (neuron-by-neuron) algorithm with forward-backward computation. NBN is derived from the LM algorithm, but it can handle arbitrarily connected neuron (ACN) networks, and its convergence is improved [1][2].

EBP, forward-only: This is the EBP algorithm with forward-only computation [3]. It can work on arbitrarily connected neuron networks.

NBN, forward-only: This is the NBN algorithm with forward-only computation. It can handle arbitrarily connected neuron networks and, as mentioned above, works faster than the "NBN" algorithm, especially for networks with multiple outputs.

NBN, improved: This is a newly developed second-order algorithm, implemented with forward-only computation, so it can handle arbitrarily connected neuron networks. In this algorithm the Hessian matrix is inverted only once per iteration, so it should compute faster than the LM (and NBN) algorithm, which may invert the Hessian matrix several times per iteration. Trainability (convergence) is also improved in this algorithm.

The first two algorithms work only for restricted network topologies, while the rest can handle arbitrarily connected neuron (ACN) networks. Apart from "NBN, improved" (which needs further refinement), "EBP, forward-only" and "NBN, forward-only" are recommended for training: the former for very large pattern sets and networks, the latter for small and medium-sized pattern sets.

References 

  1. B. M. Wilamowski, H. Yu, “Improved Computation for Levenberg Marquardt Training,” IEEE Trans. on Neural Networks, vol. 21, no. 6, pp. 930-937, June 2010.  http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5451114

  2. B. M. Wilamowski and H. Yu, "Neural Network Learning Without Backpropagation," IEEE Trans. on Neural Networks, 2010 (Early Access). http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5580116

  3. B. M. Wilamowski, "Neural Network Architectures and Learning Algorithms," IEEE Industrial Electronics Magazine, vol. 3, no. 4, pp. 56-63, Nov 2009. http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5352485

  4. B. M. Wilamowski, N. J. Cotton, O. Kaynak, G. Dundar, "Computing Gradient Vector and Jacobian Matrix in Arbitrarily Connected Neural Networks," IEEE Trans. on Industrial Electronics, vol. 55, no. 10, pp. 3784-3790, Oct 2008 http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=4602720

  5. J. D. Hewlett, B. M. Wilamowski, G. Dundar, "Optimization Using a Modified Second-Order Approach With Evolutionary Enhancement," IEEE Trans. on Industrial Electronics, vol. 55, no. 9, pp. 3374-3380, Sept 2008 http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=4559389

  6. H. Yu and B. M. Wilamowski, "Fast and efficient training of neural networks," in Proc. 3rd IEEE Human System Interaction Conf., HSI 2010, Rzeszow, Poland, May 13-15, 2010, pp. 175-181. http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5514571

  7. B. M. Wilamowski, "Challenges in Applications of Computational Intelligence in Industrial Electronics," ISIE10 - International Symposium on Industrial Electronics, Bari, Italy, July 4-7, 2010, pp. 15-22.

  8. N. J. Cotton and Bogdan M. Wilamowski “Compensation of Sensors Nonlinearity with Neural Networks”, 24th IEEE International Conference on Advanced Information Networking and Applications 2010, pp. 1210-1217, April 2010. http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5474850

  9. H. Yu and B. M. Wilamowski, "Efficient and Reliable Training of Neural Networks," IEEE Human System Interaction Conference, HSI 2009, Catania, Italy, May 21-23, 2009, pp. 109-115. http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5090963

  10. H. Yu and B. M. Wilamowski, "C++ Implementation of Neural Networks Trainer," 13th IEEE Intelligent Engineering Systems Conference, INES 2009, Barbados, April 16-18, 2009, pp. 257-262. http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4924772