Curve Fitting


The Islamic University of Gaza
Faculty of Engineering
Civil Engineering Department
Numerical Analysis ECIV 3306
Chapter 17: Least Squares Regression

Part 5 - CURVE FITTING
Describes techniques to fit curves (curve fitting) to discrete data in order to obtain intermediate estimates. There are two general approaches to curve fitting:
- Least squares regression: the data exhibit a significant degree of scatter. The strategy is to derive a single curve that represents the general trend of the data.
- Interpolation: the data are very precise. The strategy is to pass a curve or a series of curves through each of the points.

Introduction
In engineering, two types of applications are encountered:
- Trend analysis: predicting values of the dependent variable, which may include extrapolation beyond the data points or interpolation between them.
- Hypothesis testing: comparing an existing mathematical model with measured data.

Mathematical Background
- Arithmetic mean: the sum of the individual data points $y_i$ divided by the number of points $n$:
$$\bar{y} = \frac{\sum y_i}{n}, \qquad i = 1, \ldots, n$$
- Standard deviation: the most common measure of spread for a sample:
$$s_y = \sqrt{\frac{S_t}{n-1}}, \qquad S_t = \sum (y_i - \bar{y})^2$$

Mathematical Background (cont'd)
- Variance: representation of spread by the square of the standard deviation:
$$s_y^2 = \frac{\sum (y_i - \bar{y})^2}{n-1} \quad \text{or, equivalently,} \quad s_y^2 = \frac{\sum y_i^2 - \left(\sum y_i\right)^2 / n}{n-1}$$
- Coefficient of variation: quantifies the spread of the data relative to the mean:
$$\text{c.v.} = \frac{s_y}{\bar{y}} \times 100\%$$
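These definitions translate directly into code. Below is a minimal Python sketch (not part of the original slides); the sample uses the y-values from the straight-line example later in this chapter, so the printed mean (3.4286) and standard deviation (1.9457) match the numbers derived there.

```python
import math

y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]    # y-values from the worked example below
n = len(y)

ybar = sum(y) / n                           # arithmetic mean
St = sum((yi - ybar) ** 2 for yi in y)      # total sum of squares around the mean
sy = math.sqrt(St / (n - 1))                # standard deviation
var = St / (n - 1)                          # variance
cv = sy / ybar * 100.0                      # coefficient of variation, in percent

print(ybar, sy, var, cv)                    # 3.4286, 1.9457, 3.7857, 56.75
```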

Chapter 17: Least Squares Regression
Linear Regression
Fitting a straight line to a set of paired observations $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$:
$$y = a_0 + a_1 x + e$$
where $a_1$ is the slope, $a_0$ is the intercept, and $e$ is the error, or residual, between the model and the observations.

Linear Regression: Residual
(Figure: the residual $e$ is the discrepancy between a measured point and the value predicted by the straight-line model.)

Linear Regression: Question
How do we find $a_0$ and $a_1$ so that the error is minimized?

Linear Regression: Criteria for a "Best" Fit
One option is to minimize the sum of the residuals:
$$\min \sum_{i=1}^{n} e_i = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)$$
This criterion is inadequate because positive and negative errors cancel; for example, two residuals with $e_1 = -e_2$ contribute nothing to the sum.

Linear Regression: Criteria for a "Best" Fit
Another option is to minimize the sum of the absolute values of the residuals:
$$\min \sum_{i=1}^{n} |e_i| = \sum_{i=1}^{n} |y_i - a_0 - a_1 x_i|$$
This criterion is also inadequate: it does not necessarily yield a unique line.

Linear Regression: Criteria for a "Best" Fit
A third option is the minimax criterion, which minimizes the largest residual:
$$\min_{a_0,\,a_1} \; \max_i \, |y_i - a_0 - a_1 x_i|$$
This gives undue influence to a single outlying point.

Linear Regression: Least Squares Fit
$$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_{i,\text{measured}} - y_{i,\text{model}})^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$$
$$\min S_r = \min \sum_{i=1}^{n} e_i^2 = \min \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$$
This criterion yields a unique line for a given set of data.

Linear Regression: Least Squares Fit
$$\min S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$$
The coefficients $a_0$ and $a_1$ that minimize $S_r$ must satisfy the conditions:
$$\frac{\partial S_r}{\partial a_0} = 0, \qquad \frac{\partial S_r}{\partial a_1} = 0$$

Linear Regression: Determination of $a_0$ and $a_1$
$$\frac{\partial S_r}{\partial a_0} = -2 \sum (y_i - a_0 - a_1 x_i) = 0$$
$$\frac{\partial S_r}{\partial a_1} = -2 \sum (y_i - a_0 - a_1 x_i)\, x_i = 0$$
Expanding, and noting that $\sum a_0 = n a_0$, gives the normal equations:
$$n a_0 + \left(\sum x_i\right) a_1 = \sum y_i$$
$$\left(\sum x_i\right) a_0 + \left(\sum x_i^2\right) a_1 = \sum x_i y_i$$
These are 2 equations with 2 unknowns and can be solved simultaneously.

Linear Regression: Determination of $a_0$ and $a_1$
$$a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left(\sum x_i\right)^2}, \qquad a_0 = \bar{y} - a_1 \bar{x}$$
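These closed-form expressions map directly onto code. A minimal Python sketch (the function name and structure are mine, not from the slides):

```python
def fit_line(x, y):
    """Least-squares slope a1 and intercept a0 from the summation formulas."""
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi * xi for xi in x)
    a1 = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    a0 = sum_y / n - a1 * sum_x / n     # a0 = ybar - a1 * xbar
    return a0, a1
```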

(Figures: data spread around the mean of $y$, and data spread around the best-fit line.)

(Figure: examples of linear regression with (a) small and (b) large residual errors.)

Error Quantification of Linear Regression
- Total sum of the squares around the mean of the dependent variable $y$:
$$S_t = \sum (y_i - \bar{y})^2$$
- Sum of the squares of the residuals around the regression line:
$$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$$

Error Quantification of Linear Regression
$S_t - S_r$ quantifies the improvement, or error reduction, obtained by describing the data with a straight line rather than with an average value:
$$r^2 = \frac{S_t - S_r}{S_t}$$
where $r^2$ is the coefficient of determination and $r$ is the correlation coefficient.

Error Quantification of Linear Regression
For a perfect fit, $S_r = 0$ and $r = r^2 = 1$, signifying that the line explains 100 percent of the variability of the data. For $r = r^2 = 0$, $S_r = S_t$ and the fit represents no improvement over the mean.
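The coefficient of determination is simple to compute once model predictions are available. A short Python sketch (function name mine):

```python
def coeff_of_determination(y, y_model):
    """r^2 = (St - Sr) / St for observed values y and model predictions y_model."""
    ybar = sum(y) / len(y)
    St = sum((yi - ybar) ** 2 for yi in y)                    # spread around the mean
    Sr = sum((yi - ym) ** 2 for yi, ym in zip(y, y_model))    # spread around the fit
    return (St - Sr) / St
```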

Least Squares Fit of a Straight Line: Example
Fit a straight line to the x and y values in the following table:

xi:  1    2    3    4    5    6    7
yi:  0.5  2.5  2.0  4.0  3.5  6.0  5.5

$n = 7$, $\sum x_i = 28$, $\sum y_i = 24.0$, $\sum x_i^2 = 140$, $\sum x_i y_i = 119.5$
$$\bar{x} = \frac{28}{7} = 4, \qquad \bar{y} = \frac{24}{7} = 3.428571$$

Least Squares Fit of a Straight Line: Example (cont'd)
$$a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left(\sum x_i\right)^2} = \frac{7 \times 119.5 - 28 \times 24}{7 \times 140 - 28^2} = 0.8392857$$
$$a_0 = \bar{y} - a_1 \bar{x} = 3.428571 - 0.8392857 \times 4 = 0.07142857$$
$$y = 0.07142857 + 0.8392857\, x$$

Least Squares Fit of a Straight Line: Example (Error Analysis)

xi   yi    (yi - ybar)^2   ei^2
1    0.5   8.5765          0.1687
2    2.5   0.8622          0.5625
3    2.0   2.0408          0.3473
4    4.0   0.3265          0.3265
5    3.5   0.0051          0.5896
6    6.0   6.6122          0.7972
7    5.5   4.2908          0.1993
Sum  24.0  22.7143         2.9911

With $y = 0.07142857 + 0.8392857\, x$ and $e_i = y_i - a_0 - a_1 x_i$:
$$S_t = \sum (y_i - \bar{y})^2 = 22.7143, \qquad S_r = \sum e_i^2 = 2.9911$$
$$r^2 = \frac{S_t - S_r}{S_t} = \frac{22.7143 - 2.9911}{22.7143} = 0.868, \qquad r = \sqrt{0.868} = 0.932$$

Least Squares Fit of a Straight Line: Example (Error Analysis)
- The standard deviation (quantifies the spread around the mean):
$$s_y = \sqrt{\frac{S_t}{n-1}} = \sqrt{\frac{22.7143}{7-1}} = 1.9457$$
- The standard error of estimate (quantifies the spread around the regression line):
$$s_{y/x} = \sqrt{\frac{S_r}{n-2}} = \sqrt{\frac{2.9911}{7-2}} = 0.7735$$
Because $s_{y/x} < s_y$, the linear regression model describes the data better than their mean does.

Algorithm for linear regression
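The slide's algorithm (the original pseudocode figure is not reproduced in this transcription) can be sketched as a single self-contained Python function combining the pieces above: it computes the coefficients, the standard error of the estimate, and the coefficient of determination.

```python
import math

def linregr(x, y):
    """Least-squares straight-line fit.

    Returns (a0, a1, syx, r2): intercept, slope, standard error of the
    estimate, and coefficient of determination.
    """
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi * xi for xi in x)

    a1 = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    a0 = sum_y / n - a1 * sum_x / n

    ybar = sum_y / n
    St = sum((yi - ybar) ** 2 for yi in y)
    Sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))
    syx = math.sqrt(Sr / (n - 2))       # standard error of the estimate
    r2 = (St - Sr) / St                 # coefficient of determination
    return a0, a1, syx, r2

# Check against the worked example above:
# linregr([1, 2, 3, 4, 5, 6, 7], [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5])
# -> a0 = 0.0714, a1 = 0.8393, syx = 0.7735, r2 = 0.868
```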

Linearization of Nonlinear Relationships
- Linear regression assumes that the relationship between the dependent and independent variables is linear.
- However, a few types of nonlinear functions can be transformed into linear regression problems:
  - the exponential equation,
  - the power equation,
  - the saturation-growth-rate equation.

(Figures: the exponential equation, the power equation, and the saturation-growth-rate equation, each shown with its linearized form.)

Linearization of Nonlinear Relationships
1. The exponential equation:
$$y = a_1 e^{b_1 x} \;\Rightarrow\; \ln y = \ln a_1 + b_1 x$$

Linearization of Nonlinear Relationships
2. The power equation:
$$y = a_2 x^{b_2} \;\Rightarrow\; \log y = \log a_2 + b_2 \log x$$

Linearization of Nonlinear Relationships
3. The saturation-growth-rate equation:
$$y = a_3 \frac{x}{b_3 + x} \;\Rightarrow\; \frac{1}{y} = \frac{1}{a_3} + \frac{b_3}{a_3} \cdot \frac{1}{x}$$
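Each transformation reduces a nonlinear fit to the straight-line machinery above. A Python sketch for the exponential case (the data are made up and assumed to follow $y = a_1 e^{b_1 x}$):

```python
import numpy as np

# hypothetical data assumed to follow y = a1 * exp(b1 * x)
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.7, 7.4, 20.1, 54.6])

# fit a straight line to (x, ln y); np.polyfit returns [slope, intercept]
b1, ln_a1 = np.polyfit(x, np.log(y), 1)
a1 = np.exp(ln_a1)
print(a1, b1)   # close to a1 = 1, b1 = 1 here, since y is roughly e^x
```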

Example
Fit the equation $y = a_2 x^{b_2}$ to the data in the following table:

xi:  1    2    3    4    5     (sum = 15)
yi:  0.5  1.7  3.4  5.7  8.4   (sum = 19.7)

Taking logarithms linearizes the model:
$$\log y = \log a_2 + b_2 \log x$$
Let $Y = \log y$, $X = \log x$, $a_0 = \log a_2$, and $a_1 = b_2$, so that $Y = a_0 + a_1 X$.

Example (cont'd)

X* = log(x):  0        0.3010   0.4771   0.6021   0.6990   (sum = 2.079)
Y* = log(y):  -0.3010  0.2304   0.5315   0.7559   0.9243   (sum = 2.141)
X*Y*:         0        0.0694   0.2536   0.4551   0.6460   (sum = 1.424)
X*^2:         0        0.0906   0.2276   0.3625   0.4886   (sum = 1.169)

$$a_1 = \frac{n \sum X_i Y_i - \sum X_i \sum Y_i}{n \sum X_i^2 - \left(\sum X_i\right)^2} = \frac{5 \times 1.424 - 2.079 \times 2.141}{5 \times 1.169 - 2.079^2} = 1.75$$
$$a_0 = \bar{Y} - a_1 \bar{X} = 0.4282 - 1.75 \times 0.41584 = -0.300$$

Linearization of Nonlinear Functions: Example (cont'd)
$$\log y = -0.300 + 1.75 \log x$$
Since $a_2 = 10^{a_0} = 10^{-0.300} \approx 0.5$:
$$y = 0.5\, x^{1.75}$$
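The fit is easy to check numerically; the following Python sketch reproduces the slope of 1.75 and an intercept near -0.300:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.5, 1.7, 3.4, 5.7, 8.4])

# straight-line fit in log-log space: log(y) = log(a2) + b2 * log(x)
b2, log_a2 = np.polyfit(np.log10(x), np.log10(y), 1)
a2 = 10.0 ** log_a2
print(b2, a2)   # b2 ~ 1.75, a2 ~ 0.50
```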

Polynomial Regression
- Some engineering data are poorly represented by a straight line.
- For these cases, a curve is better suited to fit the data.
- The least squares method can readily be extended to fit the data with higher-order polynomials.

Polynomial Regression (cont'd)
(Figure: scatter data for which a parabola is preferable to a straight line.)

Polynomial Regression (cont'd)
A 2nd-order polynomial (quadratic) is defined by:
$$y = a_0 + a_1 x + a_2 x^2 + e$$
The residual between the model and the data:
$$e_i = y_i - a_0 - a_1 x_i - a_2 x_i^2$$
The sum of the squares of the residuals:
$$S_r = \sum e_i^2 = \sum \left(y_i - a_0 - a_1 x_i - a_2 x_i^2\right)^2$$

Polynomial Regression (cont'd)
Setting the partial derivatives of $S_r$ to zero:
$$\frac{\partial S_r}{\partial a_0} = -2 \sum (y_i - a_0 - a_1 x_i - a_2 x_i^2) = 0$$
$$\frac{\partial S_r}{\partial a_1} = -2 \sum (y_i - a_0 - a_1 x_i - a_2 x_i^2)\, x_i = 0$$
$$\frac{\partial S_r}{\partial a_2} = -2 \sum (y_i - a_0 - a_1 x_i - a_2 x_i^2)\, x_i^2 = 0$$
Expanding gives the normal equations:
$$n a_0 + \left(\sum x_i\right) a_1 + \left(\sum x_i^2\right) a_2 = \sum y_i$$
$$\left(\sum x_i\right) a_0 + \left(\sum x_i^2\right) a_1 + \left(\sum x_i^3\right) a_2 = \sum x_i y_i$$
$$\left(\sum x_i^2\right) a_0 + \left(\sum x_i^3\right) a_1 + \left(\sum x_i^4\right) a_2 = \sum x_i^2 y_i$$
These are 3 linear equations with 3 unknowns $(a_0, a_1, a_2)$ and can be solved.

Polynomial Regression (cont'd)
A 3x3 system of equations must be solved to determine the coefficients of the polynomial:
$$\begin{bmatrix} n & \sum x_i & \sum x_i^2 \\ \sum x_i & \sum x_i^2 & \sum x_i^3 \\ \sum x_i^2 & \sum x_i^3 & \sum x_i^4 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_i y_i \\ \sum x_i^2 y_i \end{bmatrix}$$
The standard error and the coefficient of determination:
$$s_{y/x} = \sqrt{\frac{S_r}{n-3}}, \qquad r^2 = \frac{S_t - S_r}{S_t}$$

Polynomial Regression (cont'd)
General case: the mth-order polynomial
$$y = a_0 + a_1 x + a_2 x^2 + \cdots + a_m x^m + e$$
- A system of $(m+1) \times (m+1)$ linear equations must be solved to determine the coefficients of the mth-order polynomial.
- The standard error:
$$s_{y/x} = \sqrt{\frac{S_r}{n - (m+1)}}$$
- The coefficient of determination:
$$r^2 = \frac{S_t - S_r}{S_t}$$
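A sketch of the general case in Python with NumPy (not from the slides): build the matrix of powers of x and solve the $(m+1) \times (m+1)$ normal equations.

```python
import numpy as np

def polyfit_normal_eqs(x, y, m):
    """Fit y = a0 + a1*x + ... + am*x^m by solving the normal equations."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    Z = np.vander(x, m + 1, increasing=True)   # columns: 1, x, x^2, ..., x^m
    a = np.linalg.solve(Z.T @ Z, Z.T @ y)      # (Z^T Z) a = Z^T y
    return a                                   # [a0, a1, ..., am]
```

In production code np.polyfit (or a QR-based solver) is preferable, since forming Z^T Z squares the condition number; the version above simply mirrors the normal-equation derivation in the slides.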

Polynomial Regression - Example
Fit a second-order polynomial to the data:

xi:  0    1    2     3     4     5
yi:  2.1  7.7  13.6  27.2  40.9  61.1

$n = 6$, $\sum x_i = 15$, $\sum y_i = 152.6$, $\sum x_i^2 = 55$, $\sum x_i^3 = 225$, $\sum x_i^4 = 979$, $\sum x_i y_i = 585.6$, $\sum x_i^2 y_i = 2488.8$
$$\bar{x} = \frac{15}{6} = 2.5, \qquad \bar{y} = \frac{152.6}{6} = 25.4333$$

Polynomial Regression - Example (cont'd)
The system of simultaneous linear equations:
$$\begin{bmatrix} 6 & 15 & 55 \\ 15 & 55 & 225 \\ 55 & 225 & 979 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} 152.6 \\ 585.6 \\ 2488.8 \end{bmatrix}$$
Solving gives $a_0 = 2.47857$, $a_1 = 2.35929$, $a_2 = 1.86071$, so:
$$y = 2.47857 + 2.35929\, x + 1.86071\, x^2$$

Polynomial Regression - Example (cont'd)

xi   yi     ymodel    (yi - ybar)^2   ei^2
0    2.1    2.4786    544.44          0.1433
1    7.7    6.6986    314.47          1.0029
2    13.6   14.6400   140.03          1.0816
3    27.2   26.3029   3.12            0.8049
4    40.9   41.6871   239.22          0.6195
5    61.1   60.7929   1272.11         0.0944
Sum  152.6            2513.39         3.74657

$$S_t = \sum (y_i - \bar{y})^2 = 2513.39, \qquad S_r = \sum e_i^2 = 3.74657$$
- The standard error of estimate:
$$s_{y/x} = \sqrt{\frac{3.74657}{6-3}} = 1.12$$
- The coefficient of determination:
$$r^2 = \frac{2513.39 - 3.74657}{2513.39} = 0.99851, \qquad r = \sqrt{r^2} = 0.99925$$
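The whole example can be verified in a few lines of Python (a check, not part of the slides):

```python
import numpy as np

A = np.array([[ 6.0,  15.0,  55.0],
              [15.0,  55.0, 225.0],
              [55.0, 225.0, 979.0]])
b = np.array([152.6, 585.6, 2488.8])
a0, a1, a2 = np.linalg.solve(A, b)          # 2.47857, 2.35929, 1.86071

x = np.arange(6.0)
y = np.array([2.1, 7.7, 13.6, 27.2, 40.9, 61.1])
ymodel = a0 + a1 * x + a2 * x ** 2
St = np.sum((y - y.mean()) ** 2)            # 2513.39
Sr = np.sum((y - ymodel) ** 2)              # 3.74657
print(np.sqrt(Sr / (len(x) - 3)),           # syx ~ 1.12
      (St - Sr) / St)                       # r2  ~ 0.99851
```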

Nonlinear Regression
Some models cannot be linearized by transformation. Consider, for example:
$$y = f(x) = a_0 \left(1 - e^{-a_1 x}\right)$$
The sum of the squares of the residuals:
$$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left[y_i - a_0\left(1 - e^{-a_1 x_i}\right)\right]^2 = \sum_{i=1}^{n} \left[y_i - f(x_i)\right]^2$$
The criterion for least squares regression is again:
$$\frac{\partial S_r}{\partial a_0} = 0 \quad \text{and} \quad \frac{\partial S_r}{\partial a_1} = 0$$

Nonlinear Regression
With $y = f(x) = a_0 (1 - e^{-a_1 x})$ and $S_r = \sum_{i=1}^{n} [y_i - f(x_i)]^2$, applying the chain rule to the minimization conditions gives:
$$\frac{\partial S_r}{\partial a_0} = -2 \sum_{i=1}^{n} \left[y_i - f(x_i)\right] \frac{\partial f(x_i)}{\partial a_0} = 0$$
$$\frac{\partial S_r}{\partial a_1} = -2 \sum_{i=1}^{n} \left[y_i - f(x_i)\right] \frac{\partial f(x_i)}{\partial a_1} = 0$$

Nonlinear Regression
$$\sum_{i=1}^{n} \left[y_i - f(x_i)\right] \frac{\partial f(x_i)}{\partial a_0} = 0$$
$$\sum_{i=1}^{n} \left[y_i - f(x_i)\right] \frac{\partial f(x_i)}{\partial a_1} = 0$$
The partial derivatives are expressed at every data point $i$ in terms of $a_0$ and $a_1$. This leads to 2 equations in 2 unknowns; unlike the linear case, however, the equations are nonlinear in $a_0$ and $a_1$, so they must be solved iteratively.
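In practice the iteration is usually delegated to a library routine. A Python sketch using SciPy's curve_fit, which performs iterative nonlinear least squares (the data points below are made up to illustrate; p0 is the initial guess the iteration starts from):

```python
import numpy as np
from scipy.optimize import curve_fit

def f(x, a0, a1):
    """The saturating model y = a0 * (1 - exp(-a1 * x))."""
    return a0 * (1.0 - np.exp(-a1 * x))

# hypothetical measurements assumed to follow the model
x = np.array([0.25, 0.75, 1.25, 1.75, 2.25])
y = np.array([0.28, 0.57, 0.68, 0.74, 0.79])

(a0, a1), _ = curve_fit(f, x, y, p0=(1.0, 1.0))
print(a0, a1)   # the iteration converges to roughly a0 ~ 0.79, a1 ~ 1.68
```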
