Nonlinear sufficient dimension reduction for functional data
We propose a general theory and estimation procedures for nonlinear sufficient dimension reduction where the predictor or the response, or both, are random functions. The relation between the response and the predictor can be arbitrary, and the sets of observed time points can vary from subject to subject. The functional and nonlinear nature of the problem leads naturally to the consideration of two layers of functional
spaces: the first consisting of functions of time, and the second consisting of functions defined on the first space. We take both spaces to be reproducing kernel Hilbert spaces. A particularly attractive feature of our construction is that the two functional spaces are nested, so that the kernel for the first space determines the kernel for the second.
We propose two estimators, functional generalized sliced inverse regression and functional generalized sliced average variance estimation, for this general dimension reduction problem. We investigate the performance of our estimators by simulation, and apply them to data sets on phoneme recognition and handwritten symbols.
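To make the two-layer construction concrete, the following is a loose numerical sketch, not the authors' exact estimator: a first-layer kernel on discretized curves (approximating an inner product of functions of time) determines a second-layer Gaussian kernel on the curves themselves, and a sliced-inverse-regression-style generalized eigenproblem on the centered Gram matrix recovers nonlinear sufficient predictors. The toy data, the number of slices, and the ridge regularizer `eps` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy functional predictors: n curves observed on a common time grid,
# with a scalar response depending nonlinearly on the curves.
n, T = 100, 50
X = rng.standard_normal((n, 5)) @ rng.standard_normal((5, T))
y = np.sin(X[:, : T // 2].mean(axis=1)) + 0.1 * rng.standard_normal(n)

# First-layer kernel: inner products of curves on the time grid,
# a discrete stand-in for an RKHS of functions of time.
K1 = (X @ X.T) / T

# Second-layer kernel: a Gaussian kernel built from first-layer
# distances, so the first kernel determines the second (the nesting
# described in the abstract).
d2 = np.diag(K1)[:, None] + np.diag(K1)[None, :] - 2.0 * K1
K2 = np.exp(-d2 / np.median(d2[d2 > 0]))

# Center the second-layer Gram matrix.
H = np.eye(n) - np.ones((n, n)) / n
G = H @ K2 @ H

# Slice the response and form a between-slice covariance, in the
# spirit of sliced inverse regression.
slices = np.digitize(y, np.quantile(y, [0.25, 0.5, 0.75]))
M = np.zeros((n, n))
for s in np.unique(slices):
    idx = np.where(slices == s)[0]
    m = G[:, idx].mean(axis=1)
    M += (len(idx) / n) * np.outer(m, m)

# Regularized generalized eigenproblem; eps is a hypothetical
# ridge parameter stabilizing the inverse of G @ G.
eps = 1e-3
A = np.linalg.solve(G @ G + eps * np.eye(n), M)
vals, vecs = np.linalg.eig(A)
order = np.argsort(-vals.real)

# Estimated sufficient predictors: evaluations of the leading
# eigenfunctions at the n observed curves.
directions = G @ vecs.real[:, order[:2]]
print(directions.shape)
```

In this sketch the sufficient predictors are returned only at the observed curves; the paper's RKHS formulation additionally gives an out-of-sample rule via the kernel, and handles irregular, subject-specific time grids that this common-grid toy ignores.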