TITLE: Optimized Combination, Regularization, and Pruning in Parallel Consensual Neural Networks

AUTHOR: J.A. Benediktsson (a), J. Larsen (b), J.R. Sveinsson (a) & L.K. Hansen (b)

(a) Department of Electrical and Computer Engineering
University of Iceland
Hjardarhaga 2-6, 107 Reykjavik, Iceland
email: benedikt@hi.is

(b) connect, Department of Mathematical Modelling, Building 321
Technical University of Denmark, DK-2800 Lyngby, Denmark
emails: {jl,lkhansen}@imm.dtu.dk
www: http://eivind.imm.dtu.dk


ABSTRACT: Optimized combination, regularization, and pruning are proposed for Parallel Consensual Neural Networks (PCNNs), a neural network architecture based on the consensus of a collection of stage neural networks trained on the same input data with different representations. Here, a regularization scheme is presented for the PCNN which iteratively adapts the regularization parameters by minimizing the validation error. The use of this adaptive regularization scheme in conjunction with Optimal Brain Damage pruning is suggested both to optimize the architecture of the individual stage networks and to avoid overfitting. Experiments are conducted on a multisource remote sensing and geographic data set consisting of six data sources. The results obtained by the proposed version of the PCNN are compared to those of other classification approaches, such as the original PCNN, single-stage neural networks, and statistical classifiers. In comparison to the originally proposed PCNNs, the use of pruning and regularization not only produces simpler PCNNs but also gives higher classification accuracies. In particular, a neural-network-based non-linear combination scheme for the individual stages in the PCNN produces excellent overall classification accuracies on both training and test data.
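The Optimal Brain Damage pruning referred to above ranks each weight by a saliency derived from a diagonal Hessian approximation and removes the least salient weights. The following is a minimal NumPy sketch of that ranking step, not the paper's implementation: the function name, the toy weight vector, and the unit-curvature Hessian are illustrative assumptions, and a full PCNN would recompute the Hessian diagonal and retrain between pruning rounds.

```python
import numpy as np

def obd_prune(weights, hessian_diag, frac=0.2):
    """Zero the lowest-saliency weights, OBD-style.

    OBD saliency for weight w_i with diagonal Hessian entry H_ii is
    s_i = H_ii * w_i**2 / 2 (the estimated increase in training error
    if w_i is set to zero). The `frac` lowest-saliency weights are removed.
    """
    w = weights.ravel().copy()
    s = 0.5 * hessian_diag.ravel() * w ** 2
    k = int(frac * w.size)
    if k > 0:
        idx = np.argsort(s)[:k]  # indices of the least salient weights
        w[idx] = 0.0
    return w.reshape(weights.shape)

# Toy example: 10 weights with unit curvature, so saliency reduces to w**2 / 2
# and the smallest-magnitude weights are the ones pruned.
rng = np.random.default_rng(0)
w = rng.normal(size=10)
h = np.ones(10)
pruned = obd_prune(w, h, frac=0.3)
print(int((pruned == 0).sum()))  # 3 weights removed
```

In an iterative pruning schedule, this removal step would alternate with retraining (under the adaptive regularization described above) until validation error stops improving.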

In Proceedings of the European Symposium on Remote Sensing, vol. 3500, Barcelona, Spain, 21-25 Sept. 1998.