 |
The development of an architecture taxonomy based on formulating
a canonical filter representation. The essential part of the
taxonomy is the distinction between global and local models.
The taxonomy leads to the classification of a number of existing
neural network
architectures and, in addition, suggests the potential
development of novel structures. Various architectures are reviewed
and interpreted. In particular, we emphasize
interpretations of
the multi-layer perceptron neural network.
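The global/local distinction can be illustrated by the response of a single hidden unit; the sigmoidal unit of a multi-layer perceptron responds over the whole input axis, while a Gaussian unit responds only near its center. A minimal one-dimensional sketch (the function names are illustrative, not from the taxonomy itself):

```python
import math

def global_unit(x, w, b):
    # Sigmoidal unit: nonzero response over the whole input axis
    # (characteristic of "global" models such as the MLP).
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

def local_unit(x, c, width):
    # Gaussian unit: appreciable response only near its center c
    # (characteristic of "local" models such as RBF networks).
    return math.exp(-((x - c) ** 2) / (2.0 * width ** 2))
```

Far from the origin the sigmoidal unit saturates near 1, whereas the Gaussian unit decays to essentially zero, which is the qualitative difference the taxonomy builds on.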
|
|
Formulation of a generic nonlinear filter architecture which consists of
a combination of the canonical filter and a preprocessing unit. The
architecture may be viewed as a heterogeneous three-layer neural network.
A number of preprocessing methods are suggested that aim to
bypass the ``curse of dimensionality'' without significantly
reducing performance.
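As one illustration of such a preprocessing unit (a plausible example, not necessarily one of the methods proposed here), a PCA-style projection can reduce the input dimension before the canonical filter is applied:

```python
import numpy as np

def pca_preprocess(X, k):
    # Reduce d-dimensional filter inputs to the k leading principal
    # directions before they enter the canonical filter.
    Xc = X - X.mean(axis=0)            # center the data
    C = np.cov(Xc, rowvar=False)       # d x d sample covariance
    vals, vecs = np.linalg.eigh(C)     # eigenvalues in ascending order
    top = vecs[:, np.argsort(vals)[::-1][:k]]
    return Xc @ top                    # n x k reduced inputs

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
Z = pca_preprocess(X, 3)               # Z has shape (200, 3)
```

The filter then operates on 3 inputs instead of 10, which is the sense in which preprocessing mitigates the curse of dimensionality.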
|
|
Discussion of various algorithms for estimating the characteristic model
weights (parameters). We suggest efficient implementations
of standard first- and second-order optimization algorithms for
layered architectures.
In addition, in order to speed up convergence, a weight
initialization algorithm for 2-layer perceptron neural networks is
developed.
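A first-order update for the 2-layer perceptron under the least squares cost can be sketched as follows (a minimal single-sample gradient descent step, assuming a tanh hidden layer and linear output; the efficient implementations discussed above are not reproduced here):

```python
import numpy as np

def mlp_forward(x, W1, b1, w2, b2):
    # 2-layer perceptron: tanh hidden layer, linear output unit.
    h = np.tanh(W1 @ x + b1)
    return w2 @ h + b2, h

def gd_step(x, t, W1, b1, w2, b2, lr=0.1):
    # One first-order (gradient descent) update of the
    # least squares cost 0.5 * (y - t)^2 via backpropagation.
    y, h = mlp_forward(x, W1, b1, w2, b2)
    e = y - t                          # output error
    g_w2, g_b2 = e * h, e              # output-layer gradients
    delta = e * w2 * (1.0 - h ** 2)    # backpropagated through tanh
    g_W1, g_b1 = np.outer(delta, x), delta
    return (W1 - lr * g_W1, b1 - lr * g_b1,
            w2 - lr * g_w2, b2 - lr * g_b2)
```

Repeated application of `gd_step` drives the output error down on the training data; second-order methods accelerate this by also using curvature information.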
|
|
Clarification and discussion of fundamental limitations in the
search for optimal network architectures based upon a
decomposition of the average generalization error, called the
model error decomposition. This includes a discussion of employing
regularization.
|
|
The development and discussion of a novel generalization error
estimator, GEN,
which is valid for incomplete, nonlinear models. The ability to deal
with incomplete models is particularly important when performing ``black
box'' modeling. The models are assumed to be estimated by minimizing the
least squares cost function with a regularization term.
The estimator is based on a statistical framework and may be viewed
as an extension of Akaike's classical FPE-estimator and Moody's
GPE-estimator.
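For reference, the classical FPE estimate that GEN extends inflates the training error by a factor depending on the number of samples N and parameters p (this sketch shows only Akaike's FPE, not the GEN estimator itself):

```python
def fpe(train_mse, n_samples, n_params):
    # Akaike's classical Final Prediction Error estimate:
    # generalization error ~ training error * (N + p) / (N - p).
    N, p = n_samples, n_params
    assert N > p, "FPE requires more samples than parameters"
    return train_mse * (N + p) / (N - p)

# e.g. fpe(0.10, 100, 20) gives 0.10 * 120 / 80 = 0.15
```

The inflation factor penalizes heavily parameterized models; GEN generalizes this idea to incomplete, nonlinear, regularized models.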
|
|
Development of various statistically based pruning procedures which
generalize the Optimal Brain Damage and the Optimal Brain Surgeon
procedures.
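The starting point of Optimal Brain Damage is the saliency of each weight, the estimated increase in cost from deleting it under a diagonal Hessian approximation, s_i = H_ii w_i^2 / 2; weights with the smallest saliency are pruned first. A minimal sketch of this baseline (the statistical generalizations developed here are not shown):

```python
import numpy as np

def obd_saliencies(weights, hessian_diag):
    # OBD saliency: estimated cost increase from deleting weight i,
    # s_i = H_ii * w_i^2 / 2 (diagonal Hessian approximation).
    return 0.5 * hessian_diag * weights ** 2

def prune_smallest(weights, hessian_diag, n_prune):
    # Zero out the n_prune weights with the smallest saliency.
    s = obd_saliencies(weights, hessian_diag)
    idx = np.argsort(s)[:n_prune]
    pruned = weights.copy()
    pruned[idx] = 0.0
    return pruned
```

Optimal Brain Surgeon refines this by using the full inverse Hessian and adjusting the remaining weights after each deletion.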
|