Univariate Kernel Density Estimation


Univariate kernel density estimation

Ben Jann
ETH Zürich, Switzerland, jann@soz.gess.ethz.ch

August 3, 2007

Abstract. The methods and formulas used by the kdens package for Stata 9.2 are discussed in this document. To install kdens, type ssc install kdens in Stata. Note that the moremata package is required (type ssc install moremata).

Preliminary note: In addition to the methods presented below, kdens also supports confidence interval and variance estimation using bootstrap and jackknife techniques. The implementation of the bootstrap and jackknife for density estimation is straightforward and is therefore not discussed here.

1 Standard kernel density estimation

Let X_1, ..., X_n be a sample from X, where X has the probability density function f(x). Furthermore, let w_1, ..., w_n be associated weights (set w_i = 1 for all i if there are no weights). The density of X can then be estimated as

    f̂_K(x; h) = (1/W) Σ_{i=1}^n (w_i/h) K((x − X_i)/h)    (1)

where W = Σ_{i=1}^n w_i and K(z) is a kernel function (see Section 9). h is the smoothing parameter (the kernel halfwidth or "bandwidth"). Formula (1) is also used, for example, by official Stata's kdensity (see [R] kdensity).

2 Adaptive kernel density estimation

The adaptive kernel density estimator is defined as

    f̂^a_K(x; h) = (1/W) Σ_{i=1}^n (w_i/(hλ_i)) K((x − X_i)/(hλ_i))    (2)

where λ_i, the local bandwidth factors, are based on a preliminary fixed-bandwidth density estimate. The factors are estimated as

    λ̂_i = sqrt( G(f̂_K(X; h)) / f̂_K(X_i; h) ),    i = 1, ..., n    (3)

where G(·) stands for the geometric mean over all i. Note that G(λ) = 1 and thus G(hλ) = h. The estimator is based on Abramson (1982). Also see, for example,

Silverman (1986, 100–110), Fox (1990, 100–103), Salgado-Ugarte et al. (1993), Salgado-Ugarte and Pérez-Hernández (2003), or Van Kerm (2003).

Technical note: f̂_K(X_i; h) is determined by linear interpolation if X_i falls between the points at which the preliminary density estimate has been evaluated.

3 Approximate variance estimation

An approximate estimator for the variance of f̂_K(x; h) at point x is given as

    Ṽ{f̂_K(x; h)} = (1/(nh)) R(K) f̂_K(x; h) − (1/n) f̂_K(x; h)²    (4)

where

    R(K) = ∫ K(z)² dz

(for theoretical background see, e.g., Scott 1992, 130). A simple extension of (4) to the adaptive kernel density method is

    Ṽ{f̂^a_K(x; h)} = (1/(nhλ(x))) R(K) f̂^a_K(x; h) − (1/n) f̂^a_K(x; h)²    (5)

where λ(x) denotes the bandwidth factor at point x. However, note that (5) understates the true variance since the local bandwidth factors are assumed fixed.

Probability weights can be taken into account by adding a penalty for the amount of variability in the weights distribution. In particular,

    Ṽ{f̂_K(x; h)} = (Σ_{i=1}^n w_i² / W²) ( (1/h) R(K) f̂_K(x; h) − f̂_K(x; h)² )    (6)

where W is the sum of weights as defined above (see Van Kerm 2003 and Burkhauser et al. 1999 for a similar approach). The assumption behind this formula, however, is that the weights w are essentially independent of X, an assertion that may not be appropriate.¹

Note that (4) differs from the standard variance formula often found in the literature. The standard formula contains only the first term (see, e.g., Van Kerm 2003, Silverman 1986, 40, Härdle et al. 2004) and, although the second term asymptotically disappears, has a quite substantial bias in finite samples. Estimator (4) is more accurate than the standard formula in finite samples.

1. A point could also be made that weights should be omitted from estimation entirely if the assertion of independence is, in fact, true.
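As an illustration, the fixed-bandwidth estimator (1), the Abramson factors (3), and the approximate variance (4) can be sketched as follows (a hedged sketch using the epan2 kernel; the function names are illustrative and not part of kdens, whose implementation is in Mata):

```python
import math

def epan2(z):
    # Epan2 kernel from Table 1: K(z) = 3/4 (1 - z^2) on |z| < 1
    return 0.75 * (1.0 - z * z) if abs(z) < 1.0 else 0.0

def kde(x, data, h, weights=None):
    # Fixed-bandwidth weighted estimator, formula (1)
    if weights is None:
        weights = [1.0] * len(data)
    W = sum(weights)
    return sum(w * epan2((x - xi) / h) for w, xi in zip(weights, data)) / (W * h)

def abramson_factors(pilot):
    # Local bandwidth factors, formula (3): lambda_i = sqrt(G(fhat)/fhat(X_i)),
    # where G is the geometric mean of the pilot density values at the X_i
    g = math.exp(sum(math.log(f) for f in pilot) / len(pilot))
    return [math.sqrt(g / f) for f in pilot]

def approx_variance(fhat, n, h, RK=3.0 / 5.0):
    # Approximate pointwise variance, formula (4); R(K) = 3/5 for epan2
    return RK * fhat / (n * h) - fhat ** 2 / n
```

Note that the factors returned by abramson_factors have geometric mean 1, which is the property G(λ) = 1 used in the text.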

4 Exact variance estimation

The variance of f̂_K(x; h) can be written as

    V{f̂_K(x; h)} = (1/n) E[K_h(x − X) − E{K_h(x − X)}]²
                 = (1/n) ( E{K_h(x − X)²} − E{K_h(x − X)}² )    (7)

where K_h(z) = (1/h) K(z/h) and

    E{K_h(x − X)} = E{f̂_K(x; h)} = ∫ K_h(x − y) f(y) dy

A natural estimator for (7) is

    V̂{f̂_K(x; h)} = (1/n) ( (1/W) Σ_{i=1}^n (w_i/h²) K((x − X_i)/h)² − f̂_K(x; h)² )    (8)

or, in the case of the adaptive method,

    V̂{f̂^a_K(x; h)} = (1/n) ( (1/W) Σ_{i=1}^n (w_i/(hλ_i)²) K((x − X_i)/(hλ_i))² − f̂^a_K(x; h)² )    (9)

where w is assumed to represent frequency weights or analytic weights (for similar formulas see, e.g., Hall 1992 and Fiorio 2004). Again note that (9) will be downward biased because the local bandwidth factors are assumed fixed.

If the weights are sampling weights, an estimator for (7) may be derived as

    V̂{f̂_K(x; h)} = (1/W²) Σ_{i=1}^n w_i² ( (1/h) K((x − X_i)/h) − f̂_K(x; h) )²    (10)

In many cases the approximate estimator is quite good and using the exact formula is not worth the extra computational effort. However, the exact formula should be used if the data contain sampling weights. Furthermore, note that both the exact and the approximate variance formulas assume h fixed. Data-dependent choice of h, however, may result in additional variability of the density estimate, especially in regions with high curvature.

5 Confidence intervals

Pointwise confidence intervals for f(x) are constructed as

    f̂_K(x; h) ± z_{1−α/2} sqrt( V̂{f̂_K(x; h)} )    (11)

where z_{1−α/2} is the (1 − α/2) quantile of the standard normal distribution.

Confidence intervals such as (11) may have bad coverage due to the well-known bias in f̂_K. One suggestion to improve coverage is to use an undersmoothed density estimate to construct the confidence intervals (Hall 1992, Fiorio 2004). The implementation of this approach in kdens uses

    h_us = h · n^{1/5} / n^{τ}

where τ is the undersmoothing parameter. τ should be larger than 1/5; τ = 1/4 is the default choice in kdens once undersmoothing is requested.

6 Density estimation for bounded variables

Two simple methods to estimate the density of variables with bounded domain are the renormalization method and the reflection method. Both methods produce consistent estimates, but have a bias of order h near the boundaries (compared to the usual bias of order h² in the interior). Various more advanced correction techniques with bias of order h² at the boundaries have been proposed (see, e.g., Jones and Foster 1996 or Karunamuni and Alberts 2005). One of these methods is the linear combination technique discussed in Jones (1993).

6.1 Renormalization

The most natural way to deal with the boundary problem is to use a standard kernel density estimate and then locally rescale it relative to the amount of local kernel mass that lies within the support of X (see, e.g., Jones 1993, 137). Let L be the lower boundary of the support of X (e.g. L = 0) and U be the upper boundary (e.g. U = 1). Furthermore, let

    a_0(l, u) = ∫_l^u K(y) dy

The renormalization version of the standard estimator then is

    f̂^n_K(x; h, L, U) = f̂_K(x; h) / a_0((L − x)/h, (U − x)/h)    (12)

for x ∈ [L, U]. Furthermore, the boundary renormalization analogue to the adaptive estimator in (2) can be written as

    f̂^{n,a}_K(x; h, L, U) = (1/W) Σ_{i=1}^n [ 1 / a_0((L − x)/(hλ_i), (U − x)/(hλ_i)) ] (w_i/(hλ_i)) K((x − X_i)/(hλ_i))    (13)

The approximate variance estimator for f̂^n_K is

    Ṽ{f̂^n_K(x; h, L, U)} = (1/(nh)) [ b((L − x)/h, (U − x)/h) / a_0((L − x)/h, (U − x)/h)² ] f̂^n_K(x; h, L, U) − (1/n) f̂^n_K(x; h, L, U)²    (14)

where

    b(l, u) = ∫_l^u K(y)² dy

(see Jones 1993). The exact variance estimator can be written as

    V̂{f̂^n_K(x; h, L, U)} = V̂{f̂_K(x; h)} / a_0((L − x)/h, (U − x)/h)²    (15)

in the most simple case. The renormalization variance estimators for the adaptive kernel method and for data containing sampling weights can also be derived easily, but the resulting formulas are more complicated.

6.2 Reflection

The reflection estimator approaches the boundary problem by "reflecting" the data at the boundaries (see, e.g., Silverman 1986, 30, Ćwik and Mielniczuk 1993). In the standard case the reflection estimator is given as

    f̂^r_K(x; h, L, U) = (1/W) Σ_{i=1}^n (w_i/h) K^r(x; X_i, h, L, U)    (16)

for x ∈ [L, U], where

    K^r(x; X, h, L, U) = K((x − X)/h) + K((x + X − 2L)/h) + K((x + X − 2U)/h)

The reflection technique, that is, replacing K with K^r, can be applied analogously to the adaptive estimator, and the exact variance estimators are also easily derived. Unfortunately, however, the reflection solution for the approximate variance estimator is more complex and is not supported by kdens.

6.3 Linear combination

Let z = (x − X)/h, l = (L − x)/h, and u = (U − x)/h. The linear combination technique then replaces K(z) by

    K^{lc}(z; l, u) = [ a_2(l, u) − a_1(−u, −l) z ] / [ a_0(l, u) a_2(l, u) − (a_1(−u, −l))² ] · K(z)    (17)

where

    a_1(l, u) = ∫_l^u y K(y) dy,    a_2(l, u) = ∫_l^u y² K(y) dy

Similar to the reflection technique, exact variance estimates can be obtained by simply plugging K^{lc} into the standard formulas. Approximate variance estimation is not supported.
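To make (16) concrete, here is a minimal sketch of the reflection estimator (an illustration under the assumption of the epan2 kernel; the names are mine, not the kdens implementation):

```python
def epan2(z):
    # Epan2 kernel: K(z) = 3/4 (1 - z^2) on |z| < 1
    return 0.75 * (1.0 - z * z) if abs(z) < 1.0 else 0.0

def K_reflect(x, X, h, L, U):
    # K^r from (16): kernel mass plus the mass reflected at both boundaries,
    # i.e. contributions from the mirror images 2L - X and 2U - X
    return (epan2((x - X) / h)
            + epan2((x + X - 2.0 * L) / h)
            + epan2((x + X - 2.0 * U) / h))

def kde_reflect(x, data, h, L, U, weights=None):
    # Reflection estimator fhat^r(x; h, L, U) for x in [L, U]
    if weights is None:
        weights = [1.0] * len(data)
    W = sum(weights)
    return sum(w * K_reflect(x, xi, h, L, U)
               for w, xi in zip(weights, data)) / (W * h)
```

Near a boundary the reflected term restores the kernel mass that would otherwise fall outside the support, which is why the estimate no longer collapses toward zero at L and U.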

7 Binned estimation

7.1 Density

Kernel density estimators such as (1) involve lots of computations. One solution to reduce processing time is to estimate the density based on binned data. The binned kernel density estimator is defined as

    f̃_K(g_j; h) = (1/W) Σ_{ℓ=1}^m (c_ℓ/h) K((g_j − g_ℓ)/h),    j = 1, ..., m    (18)

where g_1, ..., g_m is a grid of m equally spaced evaluation points and c_1, ..., c_m are the associated grid counts with Σ c_ℓ = W. Given the equal spacing of grid points, equation (18) has a discrete convolution structure and can be calculated using the fast Fourier transform (see Wand and Jones 1995, 182–188, for details). This makes the estimator very fast. The grid counts are computed using linear binning as follows. Let g⁻ and g⁺ be the two nearest grid points below and above observation X_i. Then w_i (g⁺ − X_i)/(g⁺ − g⁻) is added to the grid count at g⁻ and w_i (X_i − g⁻)/(g⁺ − g⁻) is added to the count at g⁺. Note that the results from (18) are usually quite accurate even for relatively small m. The rule-of-thumb given by Hall and Wand (1996, 182) is that "grid sizes of about 400–500 are adequate for a wide range of practical situations".

The binned version of the adaptive kernel density estimator is

    f̃^a_K(g_j; h) = (1/W) Σ_{ℓ=1}^m (c_ℓ/(hλ_ℓ)) K((g_j − g_ℓ)/(hλ_ℓ)),    j = 1, ..., m    (19)

where λ_ℓ denotes the local bandwidth factors. Unfortunately, the computational shortcut used for (18) is not applicable to (19).

7.2 Variance

Variance estimation is straightforward with binned data. For example,

    Ṽ{f̃_K(g_j; h)} = (1/(nh)) R(K) f̃_K(g_j; h) − (1/n) f̃_K(g_j; h)²    (20)

If sampling weights are applied, a reasonable variance formula for the binned estimator is

    V̂{f̃_K(g_j; h)} = (1/W²) Σ_{ℓ=1}^m c_ℓ(w²) ( (1/h) K((g_j − g_ℓ)/h) − f̃_K(g_j; h) )²    (21)

with c_ℓ(w²) representing linearly binned squared weights.

7.3 Density derivatives and density functionals

Advanced automatic bandwidth selection involves estimating density functionals of the form

    R(f^(r)) = ∫ f^(r)(z)² dz = (−1)^r ∫ f^(2r)(z) f(z) dz

where f^(r) denotes the rth derivative of f. A binned approximation estimator employing the gaussian kernel can be written as

    R̃^(r)_φ(m, h) = (−1)^r (1/W) Σ_{j=1}^m c_j f̃^(2r)_φ(g_j; h)    (22)

where

    f̃^(r)_φ(g_j; h) = (1/W) Σ_{ℓ=1}^m (c_ℓ/h^{r+1}) φ^(r)((g_j − g_ℓ)/h),    j = 1, ..., m    (23)

and φ^(r) denotes the rth derivative of φ, the standard normal density. Equation (23) can be solved as the convolution of fast Fourier transforms.

7.4 Bounded variables

Binned versions of the estimators for bounded variables are usually simple to derive. For example, the binned renormalization density estimator in the standard case is

    f̃^n_K(g_j; h, L, U) = f̃_K(g_j; h) / a_0((L − g_j)/h, (U − g_j)/h),    j = 1, ..., m    (24)

for g_j ∈ [L, U].

An exception is the estimation of density derivatives and density functionals, where the renormalization and the linear combination methods have no easy solutions. Fortunately, the reflection technique is a valuable alternative in this situation.

8 Data-dependent bandwidth selection

It can be shown from asymptotic theory that the bandwidth

    h_opt = [ R(K) / ( {σ²_K}² R(f'') n ) ]^{1/5}    (25)

where

    σ²_K = ∫ z² K(z) dz,    R(K) = ∫ K(z)² dz

is "optimal" in the sense that it minimizes the asymptotic mean integrated squared error (AMISE). Note that σ²_K, the kernel variance, and R(K), the kernel "roughness", are known properties of the chosen kernel function. However, R(f''), where f'' denotes the second derivative of f, is unknown.
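Equation (25) can be checked numerically. As a sketch (my own illustration, assuming a gaussian kernel and the normal-reference value R(f'') = 3/(8√π σ⁵) with σ = 1, which yields the familiar h = (4/3)^{1/5} σ n^{−1/5}):

```python
import math

def h_amise(RK, sigma2K, Rf2, n):
    # AMISE-optimal bandwidth, equation (25):
    # h = [ R(K) / (sigma2K^2 * R(f'') * n) ]^(1/5)
    return (RK / (sigma2K ** 2 * Rf2 * n)) ** 0.2

# Gaussian kernel: sigma^2_K = 1, R(K) = 1/(2 sqrt(pi));
# normal reference with sigma = 1: R(f'') = 3/(8 sqrt(pi))
RK = 1.0 / (2.0 * math.sqrt(math.pi))
Rf2 = 3.0 / (8.0 * math.sqrt(math.pi))
h = h_amise(RK, 1.0, Rf2, n=100)
```

With these inputs h_amise reduces to (4/3)^{1/5} n^{−1/5}, the classical normal-reference bandwidth, which is the logic behind the normal scale rule of Section 8.1.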

8.1 Quick and simple rules

Normal scale rule

One idea to derive a first guess for h_opt is to assume a specific functional form for the density and then solve equation (25). If the density is assumed to be normal, the optimal h can be estimated as

    ĥ_N = δ_K ( 8√π / 3 )^{1/5} σ̂ n^{−1/5}    (26)

where σ̂ is an estimate of scale and δ_K is the "canonical bandwidth" of the chosen kernel function, that is

    δ_K = [ R(K) / {σ²_K}² ]^{1/5}

(see, e.g., Scott 1992, 141–143; Härdle et al. 2004). Usually, σ̂ = min(s_x, IQR_x/1.349) is used, where s_x is the standard deviation and IQR_x is the inter-quartile range of the observed data.

Oversmoothed bandwidth rule

It can be shown that, given the scale parameter σ, h_opt has a simple upper bound (see, e.g., Salgado-Ugarte et al. 1995). This upper bound can be estimated as

    ĥ_O = δ_K ( 243/35 )^{1/5} s_x n^{−1/5}    (27)

Note that ĥ_O ≈ 1.08 ĥ_N. While ĥ_O usually is too large and results in a density estimate that is too smooth, it is a good starting point for subjective choice of bandwidth. In fact, it may be convenient to choose the bandwidth as a fraction of ĥ_O, for example 0.8 ĥ_O or 0.5 ĥ_O.

Silverman's rule of thumb

Based on simulation studies, Silverman (1986, 45–48) suggested using

    ĥ_Sφ = 0.9 σ̂ n^{−1/5}    (28)

for the gaussian kernel, which translates to

    ĥ_S = 1.159 δ_K σ̂ n^{−1/5}    (29)

in the general case. ĥ_S is used as the default bandwidth estimate in kdens. Official Stata's kdensity uses ĥ_Sφ.
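A sketch of the gaussian-kernel rule (28) with σ̂ = min(s_x, IQR_x/1.349) may clarify the computation (the simple linear-interpolation quantile used here is an assumption of mine and may differ in detail from Stata's quantile definition):

```python
import math

def sigma_hat(data):
    # Robust scale estimate of Section 8.1: min(std. dev., IQR/1.349)
    n = len(data)
    mean = sum(data) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in data) / (n - 1))
    s = sorted(data)
    def q(p):
        # linear-interpolation quantile (illustrative definition)
        idx = p * (n - 1)
        lo = int(idx)
        hi = min(lo + 1, n - 1)
        return s[lo] + (idx - lo) * (s[hi] - s[lo])
    iqr = q(0.75) - q(0.25)
    return min(sd, iqr / 1.349) if iqr > 0 else sd

def h_silverman(data):
    # Silverman's rule (28) for the gaussian kernel
    return 0.9 * sigma_hat(data) * len(data) ** (-0.2)
```

Because both s_x and IQR_x scale linearly with the data, the resulting bandwidth is scale-equivariant: doubling the data doubles ĥ_Sφ.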

8.2 Sheather-Jones plug-in estimator

The implementation of the Sheather-Jones plug-in estimator ĥ_SJPI closely follows Sheather and Jones (1991), using a gaussian kernel. Bounded variables are taken into account using the reflection technique. Note that kdens imposes an upper limit for ĥ_SJPI. This limit is ĥ_O.

8.3 Direct plug-in estimator

The implementation of the direct plug-in estimator ĥ_DPI in kdens closely follows Wand and Jones (1995, 71–74) (also see Wand and Jones 1995, 177–189), using a gaussian kernel. Bounded variables are taken into account using the reflection technique.

8.4 Probability weights

Probability weights inflate the variance of the density estimate and formula (25) is inappropriate. A rough correction is to apply a penalty for the variability of the weights to a standard bandwidth estimate (also see formula 6), i.e.

    ĥ_w = ( n Σ_{i=1}^n w_i² / W² )^{1/5} · ĥ

where ĥ is computed by one of the above methods. kdens applies this correction if pweights are specified.

9 Kernel functions

Various kernels are supported by kdens. Note that the actual kernel functions are not included in the kdens package; they are provided by the moremata package. Tables 1–6 give an overview of the various kernels and their properties.

10 References

Abramson, I. S. 1982. On Bandwidth Variation in Kernel Estimates. A Square Root Law. The Annals of Statistics 10(4): 1217–1223.

Burkhauser, R. V., A. C. Cutts, M. C. Daly, and S. P. Jenkins. 1999. Testing the Significance of Income Distribution Changes over the 1980s Business Cycle: A Cross-National Comparison. Journal of Applied Econometrics 14(3): 253–272.

Ćwik, J. and J. Mielniczuk. 1993. Data-dependent bandwidth choice for a grade density kernel estimate. Statistics & Probability Letters 16: 397–405.

Fiorio, C. V. 2004. Confidence intervals for kernel density estimation. The Stata Journal 4(2): 168–179.

10Univariate kernel density estimationFox, J. 1990. Describing Univariate Distributions. In Modern Methods of Data Analysis,eds. J. Fox and J. S. Long, 58–123. Newbury Park, CA: Sage.Hall, P. 1992. Effect of Bias Estimation on Coverage Accuracy of Bootstrap ConfidenceIntervals for a Probability Density. The Annals of Statistics 20(2): 675–694.Hall, P. and M. P. Wand. 1996. On the Accuracy of Binned Kernel Density Estimators.Journal of Multivariate Analysis 56: 165–184.Härdle, W., M. Müller, S. Sperlich, and A. Werwatz. 2004. Nonparametric and Semiparametric Models. An Introduction. Berlin: Springer.Jones, M. C. 1993. Simpel boundary correction for kernel density estimation. Statisticsand Computing 3: 135–146.Jones, M. C. and P. J. Foster. 1996. A Simple Nonnegative Boundary Correction Methodfor Kernel Density Estimation. Statistica Sinica 6: 1005–1013.Karunamuni, R. J. and T. Alberts. 2005. On boundary correction in kernel densityestimation. Statistical Methodology 2: 191–212.Salgado-Ugarte, I. H. and M. A. Pérez-Hernández. 2003. Exploring the use of variablebandwidth kernel density estimators. The Stata Journal 3(2): 133–147.Salgado-Ugarte, I. H., M. Shimizu, and T. Taniuchi. 1993. snp6: Exploring the shapeof univariate data using kernel density estimators. Stata Technical Bulletin 16: 8–19.—. 1995. snp6.2: Practical rules for bandwidth selection in univariate density estimation. Stata Technical Bulletin 26: 23–31.Scott, D. W. 1992. Multivariate Density Estimation. Theory, Practice, and Visualization. New York: Wiley.Sheather, S. J. and M. C. Jones. 1991. A Reliable Data-Based Bandwidth SelectionMethod for Kernel Density Estimation. Journal of the Royal Statistical Society.Series B (Methodological) 53(3): 683–690.Silverman, B. W. 1986. Density Estimation for Statistics and Data Analysis. London:Chapman and Hall.Van Kerm, P. 2003. Adaptive kernel density estimation. The Stata Journal 3(2): 148–156.Wand, M. P. and M. C. Jones. 1995. Kernel Smoothing. 
London: Chapman and Hall.

Table 1: Kernel functions (the kernel functions evaluate to zero if z is outside the indicated support)

    Kernel         K(z)
    Epanechnikov   (3/4)(1 − (1/5)z²)/√5                   if |z| < √5
    Epan2          (3/4)(1 − z²)                           if |z| < 1
    Biweight       (15/16)(1 − z²)²                        if |z| < 1
    Triweight      (35/32)(1 − z²)³                        if |z| < 1
    Cosine         1 + cos(2πz)                            if |z| ≤ 1/2
    Gaussian       φ(z)
    Parzen         4/3 − 8z² + 8|z|³                       if |z| ≤ 1/2
                   8(1 − |z|)³/3                           if 1/2 < |z| ≤ 1
    Rectangular    1/2                                     if |z| < 1
    Triangular     1 − |z|                                 if |z| < 1

[Graphs of the kernel shapes omitted.]

Table 2: Kernel properties (δ_K is the canonical bandwidth defined in Section 8.1)

    Kernel         σ²_K              R(K)       δ_K      Efficiency
    Epanechnikov   1                 3/(5√5)    0.7687   1
    Epan2          1/5               3/5        1.7188   1
    Biweight       1/7               5/7        2.0362   .9939
    Triweight      1/9               350/429    2.3122   .9867
    Cosine         1/12 − 1/(2π²)    3/2        4.2612   .9897
    Gaussian       1                 1/(2√π)    0.7764   .9512
    Parzen         1/12              302/315    2.6793   .9695
    Rectangular    1/3               1/2        1.3510   .9295
    Triangular     1/6               2/3        1.8882   .9859
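The entries of Tables 1 and 2 can be verified numerically; the following is a sketch of such a check for a few of the kernels (my own illustration using a midpoint rule, not part of kdens or moremata):

```python
def epan2(z):      return 0.75 * (1 - z * z) if abs(z) < 1 else 0.0
def biweight(z):   return 15.0 / 16.0 * (1 - z * z) ** 2 if abs(z) < 1 else 0.0
def triangular(z): return 1.0 - abs(z) if abs(z) < 1 else 0.0

def moments(K, lo=-1.0, hi=1.0, m=20000):
    # Midpoint rule for: total mass (should be 1), the kernel variance
    # sigma^2_K = int z^2 K(z) dz, and the roughness R(K) = int K(z)^2 dz
    step = (hi - lo) / m
    total = var = rough = 0.0
    for i in range(m):
        z = lo + (i + 0.5) * step
        k = K(z)
        total += k * step
        var += z * z * k * step
        rough += k * k * step
    return total, var, rough

t, v, r = moments(epan2)  # expect values close to 1, 1/5, 3/5
```

The same check applied to the other kernels reproduces the σ²_K and R(K) columns of Table 2 to within the quadrature error.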

Table 3: Kernel integrals (the integrals evaluate to 0 if z is below the kernel support and 1 if above)

    Kernel         ∫_{−∞}^z K(y) dy
    Epanechnikov   1/2 + (3/4)(z − z³/15)/√5                       if |z| < √5
    Epan2          1/2 + (3/4)(z − z³/3)                           if |z| < 1
    Biweight       1/2 + (15/16)(z − (2/3)z³ + (1/5)z⁵)            if |z| < 1
    Triweight      1/2 + (35/32)(z − z³ + (3/5)z⁵ − (1/7)z⁷)       if |z| < 1
    Cosine         1/2 + z + sin(2πz)/(2π)                         if |z| ≤ 1/2
    Gaussian       Φ(z)
    Parzen         (2/3)(1 + z)⁴                                   if −1 ≤ z ≤ −1/2
                   1/2 + (4/3)z − (8/3)z³ − 2z⁴                    if −1/2 ≤ z ≤ 0
                   1/2 + (4/3)z − (8/3)z³ + 2z⁴                    if 0 ≤ z ≤ 1/2
                   1 − (2/3)(1 − z)⁴                               if 1/2 ≤ z ≤ 1
    Rectangular    (1 + z)/2                                       if |z| < 1
    Triangular     1/2 + z + z²/2                                  if −1 ≤ z ≤ 0
                   1/2 + z − z²/2                                  if 0 ≤ z ≤ 1

[Graphs of the integrals omitted.]

Table 4: Kernel squared integrals (the integrals evaluate to 0 if z is below the kernel support and R(K) if above)

    Kernel         ∫_{−∞}^z K(y)² dy
    Epanechnikov   3/(10√5) + (9/80)(z − (2/15)z³ + (1/125)z⁵)     if |z| < √5
    Epan2          3/10 + (9/16)(z − (2/3)z³ + (1/5)z⁵)            if |z| < 1
    Biweight       5/14 + (225/256)(z − (4/3)z³ + (6/5)z⁵ − (4/7)z⁷ + (1/9)z⁹)    if |z| < 1
    Triweight      175/429 + (1225/1024)(z − 2z³ + 3z⁵ − (20/7)z⁷ + (5/3)z⁹ − (6/11)z¹¹ + (1/13)z¹³)    if |z| < 1
    Cosine         3/4 + (3/2)z + sin(2πz)/π + sin(4πz)/(8π)       if |z| ≤ 1/2
    Gaussian       Φ(z√2)/(2√π)
    Parzen         (64/63)(1 + z)⁷                                 if −1 ≤ z ≤ −1/2
                   151/315 + (16/9)z − (64/9)z³ − (16/3)z⁴ + (64/5)z⁵ + (64/3)z⁶ + (64/7)z⁷    if −1/2 ≤ z ≤ 0
                   151/315 + (16/9)z − (64/9)z³ + (16/3)z⁴ + (64/5)z⁵ − (64/3)z⁶ + (64/7)z⁷    if 0 ≤ z ≤ 1/2
                   302/315 − (64/63)(1 − z)⁷                       if 1/2 ≤ z ≤ 1
    Rectangular    (1 + z)/4                                       if |z| < 1
    Triangular     (1 + z)³/3                                      if −1 ≤ z ≤ 0
                   2/3 − (1 − z)³/3                                if 0 ≤ z ≤ 1

[Graphs of the integrals omitted.]

Table 5: Integrals over yK(y) (the integrals evaluate to 0 if z is outside the kernel support)

    Kernel         ∫_{−∞}^z y K(y) dy
    Epanechnikov   −(3√5/16)(1 − (1/5)z²)²                         if |z| < √5
    Epan2          −(3/16)(1 − z²)²                                if |z| < 1
    Biweight       −(5/32)(1 − z²)³                                if |z| < 1
    Triweight      −(35/256)(1 − z²)⁴                              if |z| < 1
    Cosine         −1/8 + z²/2 + z sin(2πz)/(2π) + (1 + cos(2πz))/(4π²)    if |z| ≤ 1/2
    Gaussian       −φ(z) = −(1/√(2π)) exp(−z²/2)
    Parzen         (8/3)[(1 + z)⁵/5 − (1 + z)⁴/4]                  if −1 ≤ z ≤ −1/2
                   −7/60 + (2/3)z² − 2z⁴ − (8/5)z⁵                 if −1/2 ≤ z ≤ 0
                   −7/60 + (2/3)z² − 2z⁴ + (8/5)z⁵                 if 0 ≤ z ≤ 1/2
                   −(8/3)[(1 − z)⁴/4 − (1 − z)⁵/5]                 if 1/2 ≤ z ≤ 1
    Rectangular    −(1 − z²)/4                                     if |z| < 1
    Triangular     −1/6 + z²/2 + z³/3                              if −1 ≤ z ≤ 0
                   −1/6 + z²/2 − z³/3                              if 0 ≤ z ≤ 1

[Graphs of the integrals omitted.]

Table 6: Integrals over y²K(y) (the integrals evaluate to 0 if z is below the kernel support and σ²_K if above)

    Kernel         ∫_{−∞}^z y² K(y) dy
    Epanechnikov   1/2 + (1/(4√5))z³ − (3/(100√5))z⁵               if |z| < √5
    Epan2          1/10 + (1/4)z³ − (3/20)z⁵                       if |z| < 1
    Biweight       1/14 + (5/16)z³ − (3/8)z⁵ + (15/112)z⁷          if |z| < 1
    Triweight      1/18 + (35/96)z³ − (21/32)z⁵ + (15/32)z⁷ − (35/288)z⁹    if |z| < 1
    Cosine         1/24 − 1/(4π²) + z³/3 + z² sin(2πz)/(2π) + z cos(2πz)/(2π²) − sin(2πz)/(4π³)    if |z| ≤ 1/2
    Gaussian       Φ(z) − z φ(z)
    Parzen         (8/3)[z³/3 + (3/4)z⁴ + (3/5)z⁵ + z⁶/6 + 1/60]   if −1 ≤ z ≤ −1/2
                   1/24 + (4/9)z³ − (8/5)z⁵ − (4/3)z⁶              if −1/2 ≤ z ≤ 0
                   1/24 + (4/9)z³ − (8/5)z⁵ + (4/3)z⁶              if 0 ≤ z ≤ 1/2
                   1/12 − (8/3)[(1 − z)⁴/4 − (2/5)(1 − z)⁵ + (1 − z)⁶/6]    if 1/2 ≤ z ≤ 1
    Rectangular    (1 + z³)/6                                      if |z| < 1
    Triangular     1/12 + z³/3 + z⁴/4                              if −1 ≤ z ≤ 0
                   1/12 + z³/3 − z⁴/4                              if 0 ≤ z ≤ 1

[Graphs of the integrals omitted.]
