Neural Network Studies. 3. Variable Selection in the Cascade-Correlation Learning Architecture

Bibliographic Details
Published in: Journal of Chemical Information and Computer Sciences, Vol. 38, No. 4, pp. 651–659
Main Authors: Kovalishyn, Vasyl V.; Tetko, Igor V.; Luik, Alexander I.; Kholodovych, Vladyslav V.; Villa, Alessandro E. P.; Livingstone, David J.
Format: Journal Article
Language: English
Published: American Chemical Society, 01.07.1998
Summary: Pruning methods for feed-forward artificial neural networks trained by the cascade-correlation learning algorithm are proposed. The cascade-correlation algorithm starts with a small network and dynamically adds new nodes until the analyzed problem has been solved. This feature of the algorithm removes the requirement to predefine the architecture of the neural network prior to training. The developed pruning methods are used to estimate the importance of large sets of initial variables for quantitative structure–activity relationship studies and simulated data sets. The calculated results are compared with the performance of fixed-size back-propagation neural networks and multiple regression analysis, and are carefully validated using different training/test set protocols, such as leave-one-out and full cross-validation procedures. The results suggest that the pruning methods can be successfully used to optimize the set of variables for cascade-correlation neural networks. Variables selected by these methods improve neural network prediction ability compared to that obtained with the unpruned sets of variables.
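The summary describes the approach only at a high level; the specific pruning schemes are detailed in the article itself. As a rough, illustrative sketch of the general idea (not the authors' actual methods, and using an ordinary fixed-size network rather than cascade-correlation), the numpy-only example below trains a small feed-forward net on simulated data and ranks input variables by how much the validation error grows when each variable is clamped to its training mean. The data, network size, and scoring rule are all assumptions made for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: 10 candidate variables, only the first three carry signal.
X = rng.normal(size=(200, 10))
y = X[:, 0] - 2.0 * X[:, 1] + 0.5 * X[:, 2] + 0.1 * rng.normal(size=200)

X_tr, X_va = X[:150], X[150:]
y_tr, y_va = y[:150], y[150:]

def train_mlp(X, y, hidden=8, lr=0.05, epochs=2000, seed=1):
    """Tiny one-hidden-layer regression net trained by batch gradient descent."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=hidden)
    b2 = 0.0
    n = len(y)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)          # hidden activations
        err = h @ W2 + b2 - y             # prediction residuals
        # Backpropagate the mean-squared-error gradient.
        gW2 = h.T @ err / n
        gb2 = err.mean()
        dh = np.outer(err, W2) * (1.0 - h ** 2)
        gW1 = X.T @ dh / n
        gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return W1, b1, W2, b2

def predict(params, X):
    W1, b1, W2, b2 = params
    return np.tanh(X @ W1 + b1) @ W2 + b2

params = train_mlp(X_tr, y_tr)
base_mse = np.mean((predict(params, X_va) - y_va) ** 2)

# Sensitivity-style importance: neutralize one input at a time (clamp it to
# its training mean) and measure how much the validation error degrades.
importance = []
for j in range(X.shape[1]):
    X_mod = X_va.copy()
    X_mod[:, j] = X_tr[:, j].mean()
    mse = np.mean((predict(params, X_mod) - y_va) ** 2)
    importance.append(mse - base_mse)

# Variables whose removal hurts most are ranked as most important;
# the low-ranked ones are candidates for pruning before retraining.
ranking = np.argsort(importance)[::-1]
print("variables ranked by importance:", ranking)
```

In the paper, this kind of importance estimate is applied to cascade-correlation networks, which add hidden units one at a time during training instead of fixing the architecture in advance, and the pruned variable sets are validated with leave-one-out and full cross-validation.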
ISSN: 0095-2338, 1549-960X, 1520-5142
DOI: 10.1021/ci980325n