Nonlinear Functional Regression: A Functional RKHS Approach


Nonlinear functional regression: a functional RKHS approach

Hachem Kadri (SequeL Project, INRIA Lille - Nord Europe, Villeneuve d'Ascq, France)
Emmanuel Duflos (SequeL Project/LAGIS, INRIA Lille/Ecole Centrale de Lille, Villeneuve d'Ascq, France)
Philippe Preux (SequeL Project/LIFL, INRIA Lille/Université de Lille, Villeneuve d'Ascq, France)
Stéphane Canu (LITIS, INSA de Rouen, St Etienne du Rouvray, France)
Manuel Davy (LAGIS/Vekia SAS, Ecole Centrale de Lille, Villeneuve d'Ascq, France)

Appearing in Proceedings of the 13th International Conference on Artificial Intelligence and Statistics (AISTATS) 2010, Chia Laguna Resort, Sardinia, Italy. Volume 9 of JMLR: W&CP 9. Copyright 2010 by the authors.

Abstract

This paper deals with functional regression, in which the input attributes as well as the response are functions. To deal with this problem, we develop a functional reproducing kernel Hilbert space approach; here, a kernel is an operator acting on a function and yielding a function. We demonstrate basic properties of these functional RKHS, as well as a representer theorem for this setting; we investigate the construction of kernels; we provide some experimental insight.

1 Introduction

We consider functional regression in which data attributes as well as responses are functions: in this setting, an example is a couple (x_i(s), y_i(t)) in which both x_i(s) and y_i(t) are real functions, that is x_i(s) ∈ G_x and y_i(t) ∈ G_y, where G_x and G_y are real Hilbert spaces. We notice that s and t can belong to different sets. This setting naturally appears when we wish to predict the evolution of a certain quantity in relation to some other quantities measured along time. It is often the case that this kind of data is discretized so as to deal with a classical regression problem in which a scalar value has to be predicted for a set of vectors. It is true that the measurement process itself very often provides a vector rather than a function, but the vector is really a discretization of a real attribute, which is a function. Furthermore, if the discretization step is small, the vectors may become very large. To get a better idea about typical functional data and related statistical tasks, Figure 1 presents temperature and precipitation curves observed at 35 weather stations of Canada (Ramsay and Silverman, 2005), where the goal is to predict the complete log daily precipitation profile of a weather station from information on the complete daily temperature profile. We think that handling these data as what they really are, that is functions, is at least an interesting path to investigate; moreover, conceptually speaking, we think it is the correct way to handle this problem.

Functional data analysis research can be largely classified into three methodologies based on different concepts: smoothing (Ramsay and Silverman, 2005), functional analysis (Ferraty and Vieu, 2006) and stochastic processes (He et al., 2004; Preda et al., 2007). Using functional analysis (Rudin, 1991), each observational unit is treated as an element in a function space, and functional analysis concepts such as operator theory are used. In the stochastic process methodology, each functional sample unit is considered as a realization from a random process. This work belongs to the functional analysis methodology. To predict infinite-dimensional responses from functional factors, we extend works on vector-valued kernels (Micchelli and Pontil, 2005a,b) to functional kernels. This leads us to generalize the notions of kernel and reproducing kernel Hilbert space (RKHS) to operators and functional RKHS. As a first step, in this paper, we investigate the use of an L² error measure, along with the use of an L² regularizer. We show that classical results on RKHS may be straightforwardly extended to functional RKHS (Lian, 2007; Preda, 2007); the representer theorem is restated in this context; the construction of operator kernels is also discussed, and we exhibit a counterpart of the Gaussian kernel for this setting. These foundations having been laid, we have investigated the practical use of these results on some test problems.

Figure 1: Daily weather data for 35 Canadian stations. (a) Temperature. (b) Precipitation.

Works that have dealt with functional regression are very few. There is mostly the work of Ramsay and Silverman (2005), which is a linear approach to functional regression. With regards to our approach, which is nonlinear, Ramsay et al.'s work deals with parametric regression, and it is not grounded on RKHS. Lian (2007) may be seen as a first step of our work; however, we provide a set of new results (Theorems 1 and 2), a new demonstration of the representer theorem (in the functional case), and we study the construction of kernels (Sec. 3.1), a point which is absent from Lian's work, where the kernel is restricted to a (scaled) identity operator, though it is a crucial point for any practical use.

2 RKHS and functional data

The problem of functional regression consists in approximating an unknown function f : G_x → G_y from functional data (x_i(s), y_i(t))_{i=1}^n ∈ G_x × G_y, where G_x : Ω_x → R and G_y : Ω_y → R, such that y_i(t) = f(x_i(s)) + ε_i(t), with ε_i(t) some functional noise. Assuming that x_i and y_i are functions, we consider a real reproducing kernel Hilbert space equipped with an inner product. Considering a functional Hilbert space F, the best estimate f̂ ∈ F of f is obtained by minimizing the empirical risk defined by:

    Σ_{i=1}^n ‖y_i − f̂(x_i)‖²_{G_y},  f̂ ∈ F.

Depending on F, this problem can be ill-posed, and a classical way to turn it into a well-posed problem is to use a regularization term (Vapnik, 1998). Therefore, the solution of the problem is the f ∈ F that minimizes the regularized empirical risk J_λ(f):

    J_λ : F → R,  J_λ(f) = Σ_{i=1}^n ‖y_i − f(x_i)‖²_{G_y} + λ‖f‖²_F    (1)

where λ ∈ R⁺ is the regularization parameter.

In the case of scalar data, it is well known (Wahba, 1990) that under general conditions on real RKHS, the solution of this minimization problem can be written as:

    f(x) = Σ_{i=1}^n w_i k(x_i, x),  w_i ∈ R,

where k is the reproducing kernel of a real Hilbert space. An extension of this solution to the domain of functional data takes the following form:

    f(·) = Σ_{i=1}^n K_F(x_i(s), ·) β_i(t)

where the functions β_i(t) are in G_y and the reproducing kernel K_F of the functional Hilbert space is an operator-valued function.

In the next subsections and in Sec. 3, basic notions and properties of real RKHS are generalized to functional RKHS. In the remainder of this paper, we use the simplified notations x_i and y_i instead of x_i(s) and y_i(t).

2.1 Functional Reproducing Kernel Hilbert Space

Let L(G_y) be the set of bounded operators from G_y to G_y. Hilbert spaces of scalar functions with reproducing kernels were introduced and studied in Aronszajn (1950). In Micchelli and Pontil (2005a), Hilbert spaces of vector-valued functions with operator-valued reproducing kernels for multi-task learning (Micchelli and Pontil, 2005b) are constructed. In this section, we outline the theory of reproducing kernel Hilbert spaces (RKHS) of operator-valued functions (Senkene and Tempel'man, 1973), and we demonstrate some basic properties of real RKHS which are restated for the functional case.

Definition 1 An L(G_y)-valued kernel K_F(w, z) on G_x is a function K_F(·,·) : G_x × G_x → L(G_y);

- K_F is Hermitian if K_F(w, z) = K_F(z, w)*,
- it is nonnegative on G_x if it is Hermitian and, for every natural number r and all {(w_i, u_i), i = 1, ..., r} ∈ G_x × G_y, the block matrix with ij-th entry ⟨K_F(w_i, w_j)u_i, u_j⟩_{G_y} is nonnegative.
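The nonnegativity condition can be made concrete on discretized functions. The sketch below is our own illustration, not part of the paper: functions are represented as vectors, and as an example we take the simple separable operator-valued kernel K(w, z) = k(w, z)·I (k a scalar Gaussian kernel), then check that the r × r matrix of entries ⟨K(w_i, w_j)u_i, u_j⟩ is symmetric and positive semidefinite. All function names here are ours.

```python
import numpy as np

def scalar_gaussian(w, z, gamma=0.1):
    # scalar Gaussian kernel on discretized functions (vectors in R^p)
    return np.exp(-gamma * np.sum((w - z) ** 2))

def operator_kernel(w, z, p):
    # separable operator-valued kernel: K(w, z) = k(w, z) * Identity on G_y
    return scalar_gaussian(w, z) * np.eye(p)

def block_entry_matrix(ws, us):
    # G[i, j] = <K(w_i, w_j) u_i, u_j>  (discretized G_y inner product)
    r, p = len(ws), len(us[0])
    G = np.empty((r, r))
    for i in range(r):
        for j in range(r):
            G[i, j] = us[j] @ (operator_kernel(ws[i], ws[j], p) @ us[i])
    return G

rng = np.random.default_rng(0)
ws = [rng.standard_normal(20) for _ in range(5)]   # inputs w_i in G_x
us = [rng.standard_normal(20) for _ in range(5)]   # directions u_i in G_y
G = block_entry_matrix(ws, us)
assert np.allclose(G, G.T)                  # Hermitian (here: symmetric) kernel
assert np.linalg.eigvalsh(G).min() > -1e-10 # nonnegative block matrix
```

For this separable choice the matrix is the Hadamard product of two Gram matrices, so positive semidefiniteness follows from the Schur product theorem.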

Definition 2 A Hilbert space F of functions from G_x to G_y is called a reproducing kernel Hilbert space if there is a nonnegative L(G_y)-valued kernel K_F(w, z) on G_x such that:

i. the function z ↦ K_F(w, z)g belongs to F for every choice of w ∈ G_x and g ∈ G_y,
ii. for every f ∈ F, ⟨f, K_F(w, ·)g⟩_F = ⟨f(w), g⟩_{G_y}.

On account of (ii), the kernel is called the reproducing kernel of F; it is uniquely determined, and the functions in (i) are dense in F.

Theorem 1 If a Hilbert space F of functions from G_x to G_y admits a reproducing kernel, then the reproducing kernel K_F(w, z) is uniquely determined by the Hilbert space F.

Elements of proof. Let K_F(w, z) be a reproducing kernel of F. Suppose that there exists another kernel K'_F(w, z) of F. Then, for all w, w' and h, g, applying the reproducing property for K and K', we get ⟨K'(w', ·)h, K(w, ·)g⟩_F = ⟨K'(w', w)h, g⟩_{G_y}. We show also that ⟨K'(w', ·)h, K(w, ·)g⟩_F = ⟨K(w', w)h, g⟩_{G_y}.

Theorem 2 An L(G_y)-valued kernel K_F(w, z) on G_x is the reproducing kernel of some Hilbert space F if and only if it is positive definite.

Elements of proof. Necessity. Let K_F(w, z), w, z ∈ G_x, be the reproducing kernel of a Hilbert space F. Using the reproducing property of the kernel K_F(w, z), we obtain

    Σ_{i,j=1}^n ⟨K_F(w_i, w_j)u_i, u_j⟩_{G_y} = ‖Σ_{i=1}^n K_F(w_i, ·)u_i‖²_F ≥ 0

for any {w_i, w_j} ∈ G_x and {u_i, u_j} ∈ G_y.

Sufficiency. Let F_0 be the space of all G_y-valued functions f of the form f(·) = Σ_{i=1}^n K_F(w_i, ·)α_i, where w_i ∈ G_x and α_i ∈ G_y, i = 1, ..., n. We define the inner product of the functions f and g from F_0 as follows:

    ⟨f(·), g(·)⟩_{F_0} = ⟨Σ_{i=1}^n K_F(w_i, ·)α_i, Σ_{j=1}^n K_F(z_j, ·)β_j⟩_{F_0} = Σ_{i,j=1}^n ⟨K_F(w_i, z_j)α_i, β_j⟩_{G_y}

We show that (F_0, ⟨·,·⟩_{F_0}) is a pre-Hilbert space. Then we complete this pre-Hilbert space via Cauchy sequences to construct the Hilbert space F of G_y-valued functions. Finally, we conclude that F is a reproducing kernel Hilbert space, since F is a real inner product space that is complete under the norm ‖·‖_F defined by ‖f(·)‖_F = lim_{n→∞} ‖f_n(·)‖_{F_0}, and has K_F(·,·) as reproducing kernel.

2.2 The representer theorem

In this section, we state and prove an analog of the representer theorem for functional data.

Theorem 3 Let F be a functional reproducing kernel Hilbert space. Consider an optimization problem based on minimizing the functional J_λ(f) defined by equation (1). Then, the solution f ∈ F has the following representation:

    f(·) = Σ_{i=1}^n K_F(x_i, ·)β_i

with β_i ∈ G_y.

Elements of proof. We compute ∇J_λ(f) using the directional derivative defined by:

    D_h J_λ(f) = lim_{τ→0} (J_λ(f + τh) − J_λ(f)) / τ

A direct computation gives D_h J_λ(f) = 2 Σ_{i=1}^n ⟨f(x_i) − y_i, h(x_i)⟩_{G_y} + 2λ⟨f, h⟩_F and, by the reproducing property, ⟨f(x_i) − y_i, h(x_i)⟩_{G_y} = ⟨K_F(x_i, ·)(f(x_i) − y_i), h⟩_F. Setting the result to zero and using the fact that D_h J_λ(f) = ⟨∇J_λ(f), h⟩ completes the proof of the theorem: f = −(1/λ) Σ_{i=1}^n K_F(x_i, ·)(f(x_i) − y_i), which is of the announced form with β_i = −(f(x_i) − y_i)/λ.

With regards to the classical representer theorem in the case of real RKHS, here the kernel K_F is an operator, and the "weights" β_i are functions (elements of G_y).

3 Functional nonlinear regression

In this section, we detail the method used to compute the regression function of functional data. To do this, we assume that the regression function belongs to a reproducing kernel functional Hilbert space constructed from a positive functional kernel. We have already shown in Theorem 2 that it is possible to construct a pre-Hilbertian space of functions in a real Hilbert space from a positive functional kernel, and that with some additional assumptions it can be completed to obtain a reproducing kernel functional Hilbert space. Therefore, it is important to consider the problem of constructing positive functional kernels.

3.1 Construction of the functional kernel

In this section, we discuss the construction of functional kernels K_F(·,·). To construct a functional kernel, one can attempt to build an operator T^h ∈ L(G_y) from a function h ∈ G_x (Canu et al., 2003). We call h the characteristic function of the operator T^h (Rudin, 1991). In this first step, we are building a function f : G_x → L(G_y). The second step may be achieved in two ways. Either we build h from a combination of two functions h_1 and h_2 in G_x, or we combine two

operators created in the first step using the two characteristic functions h_1 and h_2. The second way is more difficult because it requires the use of a function which operates on operator variables. Therefore, in this work we only deal with the construction of functional kernels using a characteristic function created from two functions in G_x.

The choice of the operator T^h plays an important role in the construction of a functional RKHS. Choosing T presents two major difficulties: computing the adjoint operator is not always easy to do, and not all operators verify the Hermitian condition of the kernel. The kernel must also be nonnegative, a property which holds or not according to the choice of the function h. The Gaussian kernel is widely used in real RKHS; here, we discuss the extension of this kernel to functional data domains. Suppose that Ω_x = Ω_y, and then G_x = G_y = G. Assuming that G is the Hilbert space L²(Ω) over R endowed with the inner product ⟨φ, ψ⟩ = ∫_Ω φ(t)ψ(t)dt, an L(G)-valued Gaussian kernel can be written as:

    K_F : G × G → L(G)
    (x, y) ↦ T^{exp(−c·(x−y)²)}

where c ≥ 0 and T^h ∈ L(G) is the operator defined by:

    T^h : G → G;  (T^h x)(t) = h(t)x(t)

It is easy to see that ⟨T^h x, y⟩ = ⟨x, T^h y⟩, so T^h is a self-adjoint operator. Thus K_F(y, x)* = K_F(y, x), and K_F is Hermitian since

    (K_F(y, x)z)(t) = (T^{exp(−c·(y−x)²)} z)(t) = exp(−c·(x(t) − y(t))²) z(t) = (K_F(x, y)z)(t)

The nonnegativity of the kernel K_F can be shown as follows. Let K̃(x, y) be the kernel defined by K̃(x, y) = T^{exp(β·xy)}. We show, using a Taylor expansion of the exponential function, that K̃ is a nonnegative kernel: T^{exp(β·xy)} = T^{1 + β·xy + (β²/2!)·(xy)² + ...} for all β ≥ 0, thus it is sufficient to show that T^{β·xy} is nonnegative to obtain the nonnegativity of K̃. It is not difficult to verify that

    Σ_{i,j} ⟨T^{β·w_i w_j} u_i, u_j⟩ = β ‖Σ_i w_i u_i‖² ≥ 0

which implies that T^{β·xy} is nonnegative. Now write

    T^{exp(−c·(x−y)²)} = T^{exp(−c·y²)·exp(2c·xy)·exp(−c·x²)} = T^{exp(−c·y²)} T^{exp(2c·xy)} T^{exp(−c·x²)}

Then

    Σ_{i,j} ⟨K_F(w_i, w_j)u_i, u_j⟩ = Σ_{i,j} ⟨T^{exp(−c·(w_i−w_j)²)} u_i, u_j⟩
    = Σ_{i,j} ⟨T^{exp(−c·w_j²)} T^{exp(2c·w_i w_j)} T^{exp(−c·w_i²)} u_i, u_j⟩
    = Σ_{i,j} ⟨T^{exp(2c·w_i w_j)} T^{exp(−c·w_i²)} u_i, T^{exp(−c·w_j²)} u_j⟩ ≥ 0

Since K̃ is nonnegative, we conclude that the kernel K_F(x, y) = T^{exp(−c·(x−y)²)} is nonnegative. It is also Hermitian, and then K_F is the reproducing kernel of a functional Hilbert space.
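On discretized functions, applying this operator-valued Gaussian kernel is just pointwise multiplication. The sketch below is our own illustration (function names and the test functions are assumptions, not from the paper):

```python
import numpy as np

def KF_apply(x, y, z, c=1.0):
    # (K_F(x, y) z)(t) = exp(-c * (x(t) - y(t))**2) * z(t):
    # the operator T^h multiplies z pointwise by h = exp(-c * (x - y)**2)
    return np.exp(-c * (x - y) ** 2) * z

# discretized functions on a common grid
t = np.linspace(0.0, 1.0, 200)
x, y = np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)
u, v = t ** 2, 1.0 - t

# K_F(x, y) = K_F(y, x): h depends on x and y only through (x - y)**2
assert np.allclose(KF_apply(x, y, u), KF_apply(y, x, u))
# T^h is self-adjoint: <T^h u, v> = <u, T^h v> (discretized L2 inner product)
assert np.isclose(KF_apply(x, y, u) @ v, u @ KF_apply(x, y, v))
```

The two assertions mirror the Hermitian and self-adjointness arguments above for the discretized case.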
3.2 Regression function estimate

Using the functional exponential kernel defined in Section 3.1, we are able to solve the minimization problem

    min_{f∈F} Σ_{i=1}^n ‖y_i − f(x_i)‖²_{G_y} + λ‖f‖²_F

Using the representer theorem, this becomes

    min_{β_i} Σ_{i=1}^n ‖y_i − Σ_{j=1}^n K_F(x_i, x_j)β_j‖²_{G_y} + λ‖Σ_{j=1}^n K_F(·, x_j)β_j‖²_F

and, using the reproducing property,

    min_{β_i} Σ_{i=1}^n ‖y_i − Σ_{j=1}^n K_F(x_i, x_j)β_j‖²_{G_y} + λ Σ_{i,j} ⟨K_F(x_i, x_j)β_i, β_j⟩_{G_y}
    = min_{β_i} Σ_{i=1}^n ‖y_i − Σ_{j=1}^n c_{ij}β_j‖²_{G_y} + λ Σ_{i,j} ⟨c_{ij}β_i, β_j⟩_{G_y}

The operator c_{ij} is computed using the characteristic function h of the kernel:

    c_{ij} = h(x_i, x_j) = exp(−c·(x_i − x_j)²),  c ∈ R⁺

(the operator c_{ij} acts by pointwise multiplication by this function). We note that the minimization problem becomes a linear multivariate regression problem of y_i on c_{ij}. In practice, the functions are not continuously measured but rather obtained by measurements at discrete points {t¹, ..., t^p} (we assume a common sampling grid for all curves); the minimization problem then takes the following form:

    min_{β_i} Σ_{i=1}^n Σ_{l=1}^p ( y_i(t^l) − Σ_{j=1}^n c_{ij}(t^l) β_j(t^l) )² + λ Σ_{i,j} Σ_{l=1}^p c_{ij}(t^l) β_i(t^l) β_j(t^l)    (2)

The expression (2) looks similar to the ordinary smoothing spline estimation (Lian, 2007; Wahba, 1990). A specific formula for the minimizer of this expression can be developed using the same method as for the computation of the smoothing spline coefficients.
Taking the discrete measurement points of the functions x and y, the estimates β̂_i, 1 ≤ i ≤ n, of the functions β_i, 1 ≤ i ≤ n, can be computed as follows. First, let C be the np × np block-diagonal matrix

    C = diag(C¹, ..., C^p)

where C^l = (c_{ij}(t^l))_{1≤i≤n, 1≤j≤n} for l = 1, ..., p. Then define in the same way the np × 1 vectors Y and β by stacking Y^l = (y_i(t^l))_{1≤i≤n} and β^l = (β_i(t^l))_{1≤i≤n}. Now take the matrix formulation of expression (2):

    min_β trace((Y − Cβ)(Y − Cβ)^T) + λ trace(Cββ^T)    (3)

where the trace operation is defined as trace(A) = Σ_i a_ii. Taking the derivative of (3) with respect to the matrix β, we find that β satisfies the system of linear equations

    (C + λI)β = Y

4 Experiments

In order to evaluate the proposed RKHS functional regression approach, experiments on simulated data and meteorological data are carried out. Results obtained by our approach are compared with a B-spline implementation of the functional linear regression model for functional responses (Ramsay and Silverman, 2005). The implementation of this functional linear model is performed using the fda package¹ provided in Matlab.

In these experiments, we use the root residual sum of squares (RRSS) to quantify the estimation error of functional regression approaches. It is a measure of the discrepancy between estimated and true curves. To assess the fit of estimated curves, we consider an overall measure for each individual functional datum, defined by

    RRSS_i = √( ∫ {ŷ_i(t) − y_i(t)}² dt )

In Ramsay and Silverman (2005), the authors propose the use of the squared correlation function to evaluate the function estimates of functional regression methods. This measure takes into account the shape of all response curves in assessing goodness of fit.

Figure 2 shows an example of constructing these data. Subplots (b) and (d) represent respectively factors and responses of the functional model obtained using the following nonlinear bivariate functions f1 and f2:

    f1(a, b) = peaks(a, b)²
    f2(a, b) = 10·a·exp(−a² − b²)
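Because C is block-diagonal, the system (C + λI)β = Y of Sec. 3.2 decouples into one n × n ridge system per sampling point. The following sketch is our own illustration of that computation (function names and the toy data are assumptions), with the Gaussian characteristic function c_ij(t) = exp(−c(x_i(t) − x_j(t))²):

```python
import numpy as np

def fit_functional_ridge(X, Y, c=1.0, lam=0.1):
    """Solve (C + lam*I) beta = Y with block-diagonal C, one block per point:
    C^l[i, j] = exp(-c * (X[i, l] - X[j, l])**2).
    X, Y: (n, p) arrays of n input/output curves sampled at p common points."""
    n, p = X.shape
    beta = np.zeros((n, p))
    for l in range(p):
        Cl = np.exp(-c * (X[:, l][:, None] - X[:, l][None, :]) ** 2)
        beta[:, l] = np.linalg.solve(Cl + lam * np.eye(n), Y[:, l])
    return beta

def predict(X, beta, x_new, c=1.0):
    # representer form: f(x_new)(t^l) = sum_j exp(-c (x_new(t^l) - x_j(t^l))^2) beta_j(t^l)
    K = np.exp(-c * (x_new[None, :] - X) ** 2)   # (n, p)
    return (K * beta).sum(axis=0)

# toy usage: with lam = 0 the fit interpolates the training curves
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 30))
Y = rng.standard_normal((5, 30))
beta = fit_functional_ridge(X, Y, c=1.0, lam=0.0)
assert np.allclose(predict(X, beta, X[0]), Y[0], atol=1e-6)
```

Solving per point costs O(p·n³) rather than O((np)³) for the full system; the two are equivalent here precisely because C has no coupling between sampling points.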
In our experiments, we use the root residual sum of squares rather than the squared correlation function, since the RRSS is more suitable to quantify the estimate of curves which can be dissimilar from factors and responses.

4.1 Simulation study

To illustrate curves estimated using our RKHS approach and the linear functional regression model, we construct functional factors and responses using multiple cut planes through three-dimensional functions. Equispaced grids of 50 points on [-5, 5] for a and of 20 points on [0, 2] for b are used to compute f1 and f2. These functions are represented in a three-dimensional Cartesian coordinate system, with axis lines a, b and c (see Figure 2, subplots (a) and (c)). Factors x_i(s) and responses y_i(t) are generated by 11 cut planes parallel to the a and c axes at fixed b values. x_i(s), i = 1, ..., 11, and y_i(t), i = 1, ..., 11, are then defined as follows:

    x_i(s) = peaks(s, α_i)
    y_i(t) = 10·t·exp(−t² − γ_i²)

Figure 2: Simulated data set. (a) and (c) Plot of the functions f1 and f2 in a three-dimensional Cartesian coordinate system, with axis lines a, b and c. (b) and (d) Factor and response curves obtained by 11 cut planes of f1 and f2 parallel to the a and c axes at fixed b values.

Figure 3 illustrates the estimation of a curve obtained by a cut plane through f2 at a value outside the grid and equal to 10.5. We represent in this figure the true curve to be estimated, the linear functional regression (LFR) estimate and our RKHS estimate. Using the RKHS estimate, we can fit the true curve better than the LFR estimate and reduce the RRSS value from 2.07 to 0.94.

¹ The fda package is available online.
² peaks is a Matlab function of two variables, obtained by translating and scaling Gaussian distributions.
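On sampled curves, the RRSS criterion of Sec. 4 can be approximated by replacing the integral with the trapezoidal rule. The sketch below is our own illustration (the example curve follows the simulated family y(t) = 10·t·exp(−t² − γ²) with γ = 0, and the perturbation standing in for an estimate is arbitrary):

```python
import numpy as np

def rrss(y_hat, y_true, t):
    # RRSS = sqrt( integral (y_hat(t) - y_true(t))^2 dt ),
    # with the integral approximated by the trapezoidal rule on the grid t
    r = (y_hat - y_true) ** 2
    return float(np.sqrt(np.sum(0.5 * (r[1:] + r[:-1]) * np.diff(t))))

t = np.linspace(-5.0, 5.0, 200)
y_true = 10.0 * t * np.exp(-t ** 2)   # a response curve of the simulated family
y_hat = y_true + 0.1 * np.sin(t)      # a hypothetical estimate of it
assert rrss(y_true, y_true, t) == 0.0
assert rrss(y_hat, y_true, t) > 0.0
```

A lower RRSS means a closer fit; it is this quantity that drops from 2.07 to 0.94 in the comparison above.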

Figure 3: True curve (triangular mark), LFR prediction (circle mark) and RKHS prediction (star mark) of a curve obtained by a cut plane through f2 at a value equal to 10.5.

4.2 Application to the weather data

Ramsay and Silverman (2005) introduce the Canadian temperature data set as one of their main examples of functional data. For 35 weather stations, the daily temperature and precipitation were averaged over a period of 30 years. The goal is to predict the complete log daily precipitation profile of a weather station from information on the complete daily temperature profile. To demonstrate the performance of the proposed RKHS functional regression method, we illustrate in Figure 4 the prediction of our RKHS estimate and the LFR estimate.

On these grounds, different important issues are currently under study. All these issues have been studied for classical (scalar) regression, in classical RKHS, in the last two decades: to have more than one attribute in data,
