### ASA 128th Meeting - Austin, Texas - 1994 Nov 28 .. Dec 02

## 2aAO1. Efficient navigation of parameter landscapes.

**Michael D. Collins**

*Naval Res. Lab., Washington, DC 20375*
The covariance matrix of the gradient of the cost function contains a
great deal of information about a parameter space. The eigenvectors of the
covariance matrix form an optimal basis (in the sense of data compression) for
the gradient. Since search algorithms base their decisions on the gradient
(often in an indirect fashion), the eigenvectors in some sense form an optimal
set of generators for navigating parameter landscapes. For problems involving a
long valley, there is usually an eigenvector oriented parallel to the valley.
Search algorithms based on these optimal generators may find the deepest point in
the valley several times faster than algorithms based on other generators. The
covariance matrix also contains information about the key underlying
parameters. The most important parameters correspond to the eigenvectors
associated with the largest eigenvalues. This information can be exploited to
reparametrize with a smaller number of parameters. The covariance matrix is the
integral of the outer product of the gradient over the parameter space.
Obtaining a good estimate of this integral with the Monte Carlo method usually
requires relatively little effort, even for high-dimensional parameter spaces.
Examples are presented for geoacoustic inverse problems involving acoustic
sources and receivers located in the ocean.
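The procedure described above can be sketched as follows. This is a minimal illustration, not the author's implementation: it uses a hypothetical two-parameter cost function with a long valley along the direction (1, -1), estimates the gradient covariance matrix as a Monte Carlo average of outer products of the gradient over the parameter space, and extracts its eigenvectors. The eigenvector with the largest eigenvalue should recover the dominant parameter direction (1, 1), and the one with the smallest eigenvalue should align with the valley.

```python
import numpy as np

# Hypothetical toy cost: E(m) = u^2 + 0.01 v^2, with u = m0 + m1, v = m0 - m1.
# The valley runs along (1, -1)/sqrt(2); the steep direction is (1, 1)/sqrt(2).
def grad_E(m):
    u = m[0] + m[1]
    v = m[0] - m[1]
    return np.array([2.0 * u + 0.02 * v,
                     2.0 * u - 0.02 * v])

# Monte Carlo estimate of C = integral of (grad E)(grad E)^T over the
# parameter space, here taken as the uniform box [-1, 1]^2.
rng = np.random.default_rng(0)
samples = rng.uniform(-1.0, 1.0, size=(5000, 2))
C = np.zeros((2, 2))
for m in samples:
    g = grad_E(m)
    C += np.outer(g, g)
C /= len(samples)

# Eigendecomposition: eigh returns eigenvalues in ascending order.
eigvals, eigvecs = np.linalg.eigh(C)
valley_dir = eigvecs[:, 0]    # smallest eigenvalue: oriented along the valley
dominant_dir = eigvecs[:, -1]  # largest eigenvalue: most important parameter
```

The ratio of the eigenvalues indicates how strongly one combination of parameters dominates; in a real inverse problem this is the information one would exploit to reparametrize with fewer parameters.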