Lecture 2: Nonlinear Regression


Experimental Data Analysis
Dodo Das

Review of Lecture 1
- Likelihood of a model.
- Likelihood maximization. Normal errors. Least-squares regression.
- Linear regression. Normal equations.

Demo 1: Simple linear regression in MATLAB
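The demo code itself is not reproduced in the transcription. As a rough stand-in, here is a minimal simple-linear-regression fit via the normal equations (a Python sketch rather than MATLAB; the data values are made up for illustration):

```python
import numpy as np

# Illustrative data (made up): y is roughly 2x + 1 plus a little noise
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix with an intercept column
X = np.column_stack([np.ones_like(x), x])

# Normal equations: (X^T X) beta = X^T y
beta = np.linalg.solve(X.T @ X, X.T @ y)
intercept, slope = beta
print(intercept, slope)
```

This is exactly the least-squares solution that maximizes the likelihood under normal errors, as reviewed in Lecture 1.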


Demo 2: Polynomial regression in MATLAB
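Again, the demo itself is not transcribed. A minimal polynomial-regression sketch in Python (made-up data) makes the key point: a polynomial model is still linear in its coefficients, so ordinary least squares applies.

```python
import numpy as np

# Illustrative data (made up): roughly quadratic in x
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.array([4.2, 0.9, 0.1, 1.1, 3.8])

# Fit y = c2*x^2 + c1*x + c0; polyfit solves the linear least-squares problem
coeffs = np.polyfit(x, y, deg=2)
print(coeffs)  # highest-order coefficient first
```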


Nonlinear regression
- The model is a nonlinear function of the parameters.
- We can still write down the likelihood as before.
- But the maximum likelihood equations cannot be solved analytically.

Iterative least-squares minimization
- Choose an initial guess for the parameters.
- Evaluate the SSR.
- Propose a move in parameter space.
- If the move reduces the SSR, update the parameter values.
- Otherwise, propose a different move.
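The loop above can be sketched as a naive accept-if-better random search (a Python sketch with a made-up exponential-decay model; practical fitters choose moves more cleverly, as the next slides discuss):

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x, a, b):
    # Toy nonlinear model (illustrative choice): exponential decay
    return a * np.exp(-b * x)

x = np.linspace(0, 4, 20)
y = model(x, 2.0, 0.7)  # noise-free "data" so the sketch has a clean minimum

def ssr(params):
    return np.sum((y - model(x, *params)) ** 2)

params = np.array([1.0, 1.0])   # initial guess
best = ssr(params)
for _ in range(5000):
    move = rng.normal(scale=0.05, size=2)  # propose a move in parameter space
    trial = ssr(params + move)
    if trial < best:                       # accept only moves that reduce SSR
        params, best = params + move, trial

print(params, best)
```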

How to choose the move in parameter space?
- Gradient descent: far from a minimum, it is best to find the gradient (i.e., the direction of steepest descent) and move down the gradient of the SSR function.
- Gauss-Newton: near a minimum, construct a Taylor-series approximation of the function (to 2nd order) and determine the location of the minimum.

A compromise: Levenberg-Marquardt
- Switches between gradient descent when far from a minimum and Gauss-Newton when close to a minimum.
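In practice one rarely codes Levenberg-Marquardt by hand. As a sketch, SciPy's `curve_fit` (Python rather than the lecture's MATLAB) uses Levenberg-Marquardt with `method="lm"`; here it fits a Hill-type dose-response curve to synthetic data. The parameter names follow the lecture's example, but the specific values below are made up:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, emax, ec50, n):
    # Hill function: response rises sigmoidally with dose
    return emax * dose**n / (ec50**n + dose**n)

rng = np.random.default_rng(1)
dose = np.logspace(-2, 2, 25)
data = hill(dose, 0.8, 1.0, 1.5) + rng.normal(scale=0.02, size=dose.size)

# Levenberg-Marquardt needs a sensible initial guess (see next slide)
popt, pcov = curve_fit(hill, dose, data, p0=[1.0, 0.5, 1.0], method="lm")
print(popt)  # estimated [emax, ec50, n]
```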

Practical considerations
- Need to specify an initial guess.
- Can be trapped in a local minimum if the initial guess is not good.
- Try a number of random initial guesses, and pick the final result that has the lowest SSR.
- If computationally feasible, it is good to plot the SSR landscape over some reasonable parameter range.
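The multiple-random-starts advice can be sketched as follows (Python; the objective here is a made-up one-dimensional "SSR" with both a local and a global minimum, chosen purely for illustration):

```python
import numpy as np
from scipy.optimize import minimize

# Toy objective (made up): minima near x = +1 and x = -1; the tilt term
# makes the one near x = -1 the global minimum.
def ssr(p):
    x = p[0]
    return (x**2 - 1.0) ** 2 + 0.3 * x

rng = np.random.default_rng(2)
results = []
for _ in range(10):
    guess = rng.uniform(-2, 2, size=1)   # random initial guess
    results.append(minimize(ssr, guess)) # local optimization from that guess

best = min(results, key=lambda r: r.fun) # keep the fit with the lowest SSR
print(best.x, best.fun)
```

A single start on the wrong side of the landscape would be trapped near x = +1; taking the best of many starts recovers the global minimum.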

Other likelihood maximization schemes
- Based on stochastic simulations:
  - Markov chain Monte Carlo (MCMC)
  - Simulated annealing
- Also, many other optimization techniques (a major branch of applied math).
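The core MCMC idea can be shown in a few lines. This is a minimal Metropolis sampler (a Python sketch sampling a standard normal, not the lecture's example): always accept moves that increase the likelihood, and accept downhill moves with probability given by the likelihood ratio.

```python
import numpy as np

rng = np.random.default_rng(3)

def log_like(x):
    return -0.5 * x**2  # standard normal log-likelihood, up to a constant

x = 0.0
samples = []
for _ in range(20000):
    prop = x + rng.normal(scale=1.0)   # propose a random move
    # Metropolis rule: accept with probability min(1, L(prop)/L(x))
    if np.log(rng.uniform()) < log_like(prop) - log_like(x):
        x = prop
    samples.append(x)

samples = np.array(samples[2000:])     # discard burn-in
print(samples.mean(), samples.std())
```

Unlike the greedy search earlier, the chain also explores around the maximum, which is what later yields parameter uncertainty estimates.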

An example of MCMC

Diagnostics: Assessing quality of fits
- Visual assessment: does the fit look reasonable?
- Are the parameter estimates physically possible?
- Quantify: R²

1. Visual inspection. As a first step, it is useful to examine the best-fit curve overlaid on the data to ensure that indeed the fit closely approximates the data. As seen in Fig. 3a, the smooth curve does indeed qualitatively match the simulated dose-response curve.

2. Randomness of residuals. The best-fit curve represents the predicted value of the response. For a good model, we expect the experimental data to be randomly distributed about the best-fit curve, i.e., the residuals should be randomly positive or negative. As an example, note how the residuals for the dose-response data (Fig. 3b, solid blue lines) are randomly distributed about the horizontal axis. Systematic deviations from such randomness are the hallmark of a poor fit and suggest a poor model. For example, we also plot residuals between the data points and the average of all the responses (Fig. 3b, solid red lines). Note how these residuals are distributed very differently, with mostly negative values to the left of the midpoint and mostly positive values to the right. Also note how the magnitudes of these residuals are greatest near the two ends and smallest near the middle. These features indicate that a simple average of all the responses is not a sufficiently good descriptor for these data.

3. Coefficient of determination (R²). Beyond these qualitative features, a more objective measure of the quality of a fit is the coefficient of determination, R². It is defined as

R² = 1 − SSR / SST,

where SST is the total sum of squares,

SST = Σ_{i=1}^{N} (y_i,observed − ȳ_observed)².
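The R² definition can be computed directly from the residuals (a Python sketch with made-up observed and fitted values):

```python
import numpy as np

y_obs = np.array([0.1, 0.4, 0.5, 0.7, 0.9])       # observed responses (made up)
y_fit = np.array([0.12, 0.35, 0.55, 0.68, 0.88])  # model predictions (made up)

ssr = np.sum((y_obs - y_fit) ** 2)           # residual sum of squares
sst = np.sum((y_obs - y_obs.mean()) ** 2)    # total sum of squares
r2 = 1.0 - ssr / sst
print(r2)
```

An R² near 1 means the model explains almost all of the variation about the mean; the simple-average "model" criticized above would give R² = 0 by construction.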

Diagnostics: Assessing quality of fits
- Are the residuals randomly distributed?

Figure 3: Overview of fitting data to a model. A) Simulated dose-response data (solid circles) generated from a Hill function (equation 1) using parameter values Emax = 0.8, LEC50 …, with normally distributed noise (standard deviation 0.15) added to mimic experimental error. The best-fit curve (solid line) is obtained from nonlinear least-squares regression between the data and the model; see Table 1 for the best-fit parameter estimates. Nonlinear data-fitting algorithms search for parameter values that minimize the distance between the data and the curve. The distance …

Tomorrow
- Parameter confidence intervals.
- Bootstrap.
- Comparing parameters from two different fits: hypothesis testing.

