New Constructive Neural Network Architecture for Pattern Classification
S. S. Sridhar and M. Ponnavaikko
DOI : 10.3844/jcssp.2009.843.848
Journal of Computer Science
Volume 5, Issue 11
Problem statement: Constructive neural network learning algorithms provide optimal ways to determine the architecture of a multi-layer perceptron network, along with learning algorithms for determining appropriate weights, for pattern classification problems. These algorithms start with a small network and let it grow dynamically, adding and training neurons as needed until a satisfactory solution is found. Constructive neural network training is performed in a feed-forward paradigm under supervised learning. These supervised methods often cause the network size to grow exponentially, or leave the network with poor generalization. To address these problems, a new method for learning in constructive neural networks is necessary. Approach: To address these issues, a new Multicategory Tiling architecture was chosen for its simple topology, and an improved Adaptive Resonance Theory (ART) unsupervised training algorithm, with appropriate weight settings, was used to train the constructive networks on binary sequence patterns. The results and performance of the new algorithm were compared with those of existing constructive neural network architectures and tabulated. Results: The new architecture with the improved training algorithm converges faster, requires fewer nodes for storage and achieves better generalization in pattern classification than existing algorithms. Conclusion: Constructive neural networks can be trained with an unsupervised algorithm to achieve better performance than existing supervised algorithms.
© 2009 S. S. Sridhar and M. Ponnavaikko. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
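The abstract's core idea, growing category nodes only when existing ones fail a match test, can be illustrated with a minimal ART1-style clusterer for binary patterns. This is a generic sketch, not the authors' improved algorithm (whose weight-setting details are not given in the abstract); the function name, the `vigilance` and `beta` parameters, and the fast-learning update are standard ART1 conventions assumed here for illustration.

```python
import numpy as np

def art1_cluster(patterns, vigilance=0.7, beta=1.0):
    """Cluster binary patterns, growing category nodes as needed.

    Hypothetical minimal ART1-style sketch: each category keeps a
    binary prototype; a pattern joins the best-scoring category that
    passes the vigilance test, otherwise a new node is created --
    mirroring the constructive (growing) network idea.
    """
    prototypes = []   # one binary prototype per category node
    labels = []       # category index assigned to each input pattern
    for p in patterns:
        p = np.asarray(p, dtype=int)
        # Rank existing categories by the ART1 choice function
        # |p AND w| / (beta + |w|), best first.
        order = sorted(
            range(len(prototypes)),
            key=lambda j: -(np.sum(p & prototypes[j])
                            / (beta + np.sum(prototypes[j]))),
        )
        chosen = None
        for j in order:
            # Vigilance test: fraction of the input matched by prototype j.
            match = np.sum(p & prototypes[j]) / max(np.sum(p), 1)
            if match >= vigilance:
                # Fast learning: prototype becomes the intersection.
                prototypes[j] = p & prototypes[j]
                chosen = j
                break
        if chosen is None:
            # No resonance: grow a new category node (constructive step).
            prototypes.append(p.copy())
            chosen = len(prototypes) - 1
        labels.append(chosen)
    return labels, prototypes
```

Because learning is unsupervised and a node is added only on vigilance failure, the number of nodes is bounded by the diversity of the input patterns rather than by label-driven splitting, which is consistent with the abstract's claim of fewer storage nodes.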