Article quick-view

Research paper by

Jeffrey M. Hokanson, Paul G. Constantine

Indexed on

20th Feb 2017

Published on

20th Feb 2017

Published in

arXiv - Mathematics - Numerical Analysis


Inexpensive surrogates are useful for reducing the cost of science and engineering studies with large-scale computational models that contain many input parameters. A ridge approximation is a surrogate distinguished by its model form: namely, a nonlinear function of a few linear combinations of the input parameters. Parameter studies (e.g., optimization or uncertainty quantification) with ridge approximations can exploit this low-dimensional structure by working on the coordinates of the subspace defined by the linear combination weights, reducing the effective dimension.

We introduce a new, fast algorithm for constructing a least-squares-fit polynomial ridge approximation from function samples. Naively, this would require optimizing both the polynomial coefficients and the subspace. However, given a fixed subspace, the optimal polynomial coefficients solve a linear least-squares problem. Our proposed method exploits this structure by implicitly computing these coefficients using variable projection, which leaves an optimization problem over the subspace alone. We present an algorithm that finds the optimal subspace by optimizing over the Grassmann manifold using a Gauss-Newton Hessian approximation. We provide the details of the optimization algorithm, and we demonstrate its performance on several numerical examples. The Gauss-Newton method has superior theoretical guarantees and faster convergence than an alternating heuristic for ridge approximation proposed by Constantine, Eftekhari, and Ward [arXiv 1606.01929] that (i) optimizes the polynomial coefficients given the subspace and (ii) optimizes the subspace given the coefficients.
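The variable-projection idea in the abstract can be illustrated with a minimal sketch. This is not the authors' algorithm (which optimizes over the Grassmann manifold with a Gauss-Newton Hessian approximation); it is a toy one-dimensional ridge fit, with the direction parameterized by an angle and optimized by a simple grid search. What it does show is the key structural point: for any fixed subspace, the polynomial coefficients are eliminated by an inner linear least-squares solve, so the outer search runs over the subspace alone. All variable names and the test function here are illustrative assumptions.

```python
import numpy as np

# Toy variable-projection sketch: fit f(x) ~ p(w.x) for a one-dimensional
# ridge in R^2. The polynomial coefficients are "projected out" by an
# inner linear least-squares solve, leaving a search over w alone.

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))            # input samples in R^2
w_true = np.array([0.8, 0.6])                # hidden unit ridge direction
f = np.cos(X @ w_true) + (X @ w_true) ** 2   # a ridge-structured function

def vp_residual(theta, degree=5):
    """Residual norm after implicitly solving for polynomial coefficients."""
    w = np.array([np.cos(theta), np.sin(theta)])
    y = X @ w                                 # projected coordinates
    V = np.vander(y, degree + 1)              # polynomial (Vandermonde) basis
    c, *_ = np.linalg.lstsq(V, f, rcond=None) # inner linear least squares
    return np.linalg.norm(V @ c - f)

# Coarse grid search over the direction: a crude stand-in for the paper's
# Gauss-Newton optimization over the Grassmann manifold.
thetas = np.linspace(0.0, np.pi, 1000)
best = thetas[np.argmin([vp_residual(t) for t in thetas])]
w_est = np.array([np.cos(best), np.sin(best)])
print(np.abs(w_est @ w_true))                 # near 1: direction recovered
```

The sign ambiguity in the direction is harmless here because the fitted polynomial absorbs it, which is also why the search over [0, π] suffices in two dimensions.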