Curve Fitting - Least Squares


Curve fitting – Least squares

Curve fitting
Example used throughout: a Michaelis–Menten curve with K_M = 100 µM and v_max = 1 ATP s⁻¹.

Excurse: error of multiple measurements

Starting point: measure the same parameter N times.
The obtained values are Gaussian distributed around the mean with standard deviation σ.
What is the error of the mean of all measurements?

  Sum:                       x_1 + x_2 + ... + x_N
  Variance of sum:           N · σ²
  Standard deviation of sum: √N · σ
  Mean:                      (x_1 + x_2 + ... + x_N) / N
  Standard deviation of mean (called standard error of the mean): σ / √N

(Central limit theorem)

Excurse: error propagation

What is the error of f(x, y, z, ...) if we know the errors of x, y, z, ... (σ_x, σ_y, σ_z, ...) for purely statistical errors?
The individual variances add, scaled by the squared partial derivatives (if the parameters are uncorrelated):

  σ_f² = (∂f/∂x)² σ_x² + (∂f/∂y)² σ_y² + (∂f/∂z)² σ_z² + ...

Examples:
  Addition/subtraction (f = x ± y): squared errors add up, σ_f² = σ_x² + σ_y²
  Product (f = x · y):              squared relative errors add up, (σ_f/f)² = (σ_x/x)² + (σ_y/y)²
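The error-propagation rule above (variances add, scaled by squared partial derivatives) can be checked numerically. A minimal sketch, using a hypothetical helper `propagate` that estimates the partial derivatives by central differences; the names and data are made up for illustration:

```python
import math

def propagate(f, values, sigmas, h=1e-6):
    """Propagate uncorrelated errors through f:
    sigma_f^2 = sum_i (df/dx_i)^2 * sigma_i^2,
    with the partial derivatives estimated by central differences."""
    var = 0.0
    for i, s in enumerate(sigmas):
        up = list(values); up[i] += h
        dn = list(values); dn[i] -= h
        deriv = (f(*up) - f(*dn)) / (2 * h)
        var += (deriv * s) ** 2
    return math.sqrt(var)

# Product f = x*y: squared relative errors should add up.
x, y, sx, sy = 10.0, 4.0, 0.3, 0.2
sigma_f = propagate(lambda a, b: a * b, [x, y], [sx, sy])
expected = (x * y) * math.sqrt((sx / x) ** 2 + (sy / y) ** 2)
```

For the addition rule the same helper reproduces σ_f² = σ_x² + σ_y², since the partial derivatives are both 1.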

Excurse: error propagation (continued)

  Ratio (f = x/y):      squared relative errors add up, (σ_f/f)² = (σ_x/x)² + (σ_y/y)²
  Power (f = xⁿ):       relative error times power, σ_f/f = |n| · σ_x/x
  Logarithm (f = ln x): the error is the relative error, σ_f = σ_x/x

Curve fitting – Least squares

Starting point:
- data set with N pairs (x_i, y_i)
- x_i known exactly
- y_i Gaussian distributed around the true value with error σ_i
- errors uncorrelated
- a function f(x) which shall describe the values y (y = f(x))
- f(x) depends on one or more parameters a

Example: v = v_max · [S] / ([S] + K_M)
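The Michaelis–Menten curve used as the fitting example can be written as a one-line model function. A minimal sketch with the example values from the slides (K_M = 100 µM, v_max = 1 ATP s⁻¹); the function name is my own:

```python
def michaelis_menten(s, vmax, km):
    """Model curve v = vmax * [S] / ([S] + K_M)."""
    return vmax * s / (s + km)

# With K_M = 100 µM and vmax = 1 ATP/s, the rate is half-maximal at [S] = K_M:
v = michaelis_menten(100.0, vmax=1.0, km=100.0)  # → 0.5
```

At large [S] the curve saturates towards v_max, which is what makes the two parameters identifiable from data covering both regimes.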

Curve fitting – Least squares

Probability to get y_i for a given x_i:

  P(y_i; a) = 1 / (√(2π) σ_i) · exp( −(y_i − f(x_i; a))² / (2σ_i²) )

Probability to get the whole set y_1, ..., y_N for the set of x_i:

  P(y_1, ..., y_N; a) = ∏_{i=1}^{N} 1 / (√(2π) σ_i) · exp( −(y_i − f(x_i; a))² / (2σ_i²) )

For the best-fitting theory curve (the red curve in the slide), P(y_1, ..., y_N; a) becomes maximal.

(Example model: v = v_max · [S] / ([S] + K_M))
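The connection between the likelihood product and χ² can be verified numerically: ln P equals −χ²/2 minus a constant that does not depend on the parameters. A minimal sketch with made-up data and model predictions:

```python
import math

def gauss(y, mu, sigma):
    """Gaussian probability density of observing y around mu with width sigma."""
    return math.exp(-(y - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

ys     = [1.1, 1.9, 3.2]   # measured values (hypothetical data)
preds  = [1.0, 2.0, 3.0]   # f(x_i; a) from some model
sigmas = [0.1, 0.2, 0.3]

P     = math.prod(gauss(y, m, s) for y, m, s in zip(ys, preds, sigmas))
chi2  = sum(((y - m) / s) ** 2 for y, m, s in zip(ys, preds, sigmas))
const = sum(math.log(math.sqrt(2 * math.pi) * s) for s in sigmas)
# ln P = -chi2/2 - const, so maximizing P is the same as minimizing chi2
```

Because the constant term is independent of the fit parameters, maximizing the likelihood and minimizing χ² give the same parameter values.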

Curve fitting – Least squares

Use the logarithm of the product to get a sum, and maximize the sum:

  ln P(y_1, ..., y_N; a) = −(1/2) Σ_{i=1}^{N} ( (y_i − f(x_i; a)) / σ_i )² − Σ_{i=1}^{N} ln(√(2π) σ_i)

OR minimize χ² with:

  χ² = Σ_{i=1}^{N} ( (y_i − f(x_i; a)) / σ_i )²

This is the principle of least squares (χ² minimization).

Solve ∂χ²/∂a = 0, either analytically (only for simple functions) or numerically (specialized software, different algorithms).

- The χ² value indicates the goodness of fit.
- Errors available: use them! This gives the so-called weighted fit.
- Errors not available: the σ_i are set constant, giving a conventional (unweighted) fit.
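The numerical route can be sketched for a one-parameter model f(x) = m·x: minimize χ²(m) with a simple golden-section search and compare against the analytic weighted solution. All names and data here are my own illustration, not the lecture's code:

```python
def chi2(m, xs, ys, sigmas):
    """Weighted chi^2 for the model f(x) = m*x."""
    return sum(((y - m * x) / s) ** 2 for x, y, s in zip(xs, ys, sigmas))

def minimize_scalar(f, lo, hi, iters=200):
    """Golden-section search for the minimum of a unimodal function on [lo, hi]."""
    g = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - g * (b - a), a + g * (b - a)
        if f(c) < f(d):
            b = d
        else:
            a = c
    return (a + b) / 2

xs, ys, sigmas = [1, 2, 3], [2.1, 3.9, 6.2], [0.1, 0.1, 0.2]
m_fit = minimize_scalar(lambda m: chi2(m, xs, ys, sigmas), 0.0, 5.0)

# Analytic weighted solution for comparison:
m_exact = sum(x * y / s ** 2 for x, y, s in zip(xs, ys, sigmas)) / \
          sum(x * x / s ** 2 for x, s in zip(xs, sigmas))
```

Since χ² is quadratic in m here, both routes agree; real fitting software uses more general minimizers for the same task.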

Reduced χ²

Expectation value of χ² for a weighted fit:

  ⟨χ²⟩ = ⟨ Σ_{i=1}^{N} (y_i − f(x_i; a))² / σ_i² ⟩ ≈ Σ_{i=1}^{N} σ_i² / σ_i² = N

Define the reduced χ²:

  χ²_red = χ² / (N − M),   with M the number of fit parameters;   ⟨χ²_red⟩ ≈ 1

For a weighted fit the reduced χ² should become 1 if the errors are properly chosen.

Expectation value of χ²/(N − M) for an unweighted fit:

  (1/(N − M)) Σ_{i=1}^{N} (y_i − f(x_i; a))²

This should approach the variance of a single data point.

Simple proportion

Fit f(x) = m·x. Differentiate χ² with respect to m (for σ_i = const = σ), solve ∂χ²/∂m = 0, and get:

  m = Σ x_i y_i / Σ x_i²
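The reduced χ² is a one-line computation. A minimal sketch with hypothetical data scattered around f(x) = 2x, chosen so that the scatter matches the assumed σ = 0.5:

```python
def reduced_chi2(xs, ys, sigmas, f, n_params):
    """chi^2 / (N - M): should come out near 1 for a weighted fit
    whose errors are chosen correctly."""
    chi2 = sum(((y - f(x)) / s) ** 2 for x, y, s in zip(xs, ys, sigmas))
    return chi2 / (len(xs) - n_params)

# Hypothetical data around f(x) = 2x with sigma = 0.5 on each point:
xs, ys = [1, 2, 3, 4], [2.5, 3.5, 6.5, 8.0]
r = reduced_chi2(xs, ys, [0.5] * 4, lambda x: 2 * x, n_params=1)  # → 1.0
```

A value far above 1 suggests underestimated errors or a wrong model; far below 1 suggests overestimated errors.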

Simple proportion – errors

Rewrite m = Σ x_i y_i / Σ x_i² as a linear combination of the y_i:
- the error of m is given by the errors of the y_i,
- use the rules for error propagation,
- the standard error of the determination of m (single confidence interval) is then the square root of V(m).

If no errors are available, σ² is estimated as the mean square deviation of the fitted function from the y values:

  σ² ≈ (1/(N − 1)) Σ_{i=1}^{N} (y_i − f(x_i; a))²

Straight line fit (2 parameters)

Fit f(x) = m·x + c (again for σ_i = const = σ).
Differentiate χ² with respect to c and with respect to m, set both derivatives to zero, and solve the resulting equation array:

  m = (N Σ x_i y_i − Σ x_i Σ y_i) / (N Σ x_i² − (Σ x_i)²)
  c = (Σ y_i Σ x_i² − Σ x_i Σ x_i y_i) / (N Σ x_i² − (Σ x_i)²)

Errors follow again from error propagation. For all relations which are linear in the fit parameters, analytical solutions are possible!
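Both analytic solutions, the simple proportion and the two-parameter straight line, fit in a few lines. A minimal sketch using the standard closed-form normal-equation solutions (function names are my own); points lying exactly on a line recover its slope and intercept exactly:

```python
def fit_proportion(xs, ys):
    """Least-squares m for f(x) = m*x with constant sigma: m = sum(x*y) / sum(x^2)."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def fit_line(xs, ys):
    """Unweighted least-squares slope and intercept for f(x) = m*x + c."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    d = n * sxx - sx * sx          # shared denominator N*Sum(x^2) - (Sum x)^2
    m = (n * sxy - sx * sy) / d
    c = (sy * sxx - sx * sxy) / d
    return m, c

# Points exactly on y = 2x + 1 should give m = 2, c = 1:
m, c = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```

With noisy data the same formulas return the least-squares estimates instead of the exact parameters.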

Straight line fit (2 parameters) – covariances

For multiple parameters an additional quantity appears: the covariances, which describe the interdependency/correlation between the obtained parameters.

Covariance matrix (example for three parameters p_1, p_2, p_3):

       ( Var(p_1)        cov(p_1, p_2)   cov(p_1, p_3) )
  V =  ( cov(p_1, p_2)   Var(p_2)        cov(p_2, p_3) )
       ( cov(p_1, p_3)   cov(p_2, p_3)   Var(p_3)      )

The diagonal elements V_ii = Var(p_i) are the squared errors of the fit parameters.

More in depth

- Online lecture: Statistical Methods of Data Analysis by Ian C. Brock,
  http://www-zeus.physik.uni-bonn.de/~brock/teaching/stat_ws0001/
- Numerical Recipes in C, Cambridge University Press
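For the straight-line fit the covariance matrix of (m, c) has a closed form. A sketch under the assumption of equal errors σ on all points, using the standard result V = σ² (XᵀX)⁻¹ with design-matrix rows (x_i, 1); the 2×2 inverse is written out by hand:

```python
def line_fit_covariance(xs, sigma):
    """Covariance matrix [[Var(m), cov(m,c)], [cov(m,c), Var(c)]]
    for an unweighted straight-line fit y = m*x + c with equal errors sigma."""
    n = len(xs)
    sx = sum(xs)
    sxx = sum(x * x for x in xs)
    d = n * sxx - sx * sx          # det(X^T X)
    var_m = sigma ** 2 * n / d
    var_c = sigma ** 2 * sxx / d
    cov_mc = -sigma ** 2 * sx / d
    return [[var_m, cov_mc], [cov_mc, var_c]]

V = line_fit_covariance([0, 1, 2, 3], sigma=1.0)
```

Note that the off-diagonal element vanishes when the x values are centered on zero: slope and intercept are then uncorrelated, which is why centering the data is a common trick.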

