Techniques for Finding the Distribution of a Transformation of Random Variables


TRANSFORMATIONS OF RANDOM VARIABLES

1. INTRODUCTION

1.1. Definition. We are often interested in the probability distributions or densities of functions of one or more random variables. Suppose we have a set of random variables X1, X2, X3, . . . , Xn with a known joint probability and/or density function. We may want to know the distribution of some function of these random variables, Y = Φ(X1, X2, X3, . . . , Xn). Realized values of y will be related to realized values of the X's as follows:

y = Φ(x1, x2, x3, . . . , xn)   (1)

A simple example might be a single random variable X with transformation

y = Φ(x) = log(x)   (2)

1.2. Techniques for finding the distribution of a transformation of random variables.

1.2.1. Distribution function technique. We find the region in x1, x2, x3, . . . , xn space such that Φ(x1, x2, . . . , xn) ≤ φ. We can then find the probability that Φ(x1, x2, . . . , xn) ≤ φ, i.e., P[Φ(x1, x2, . . . , xn) ≤ φ], by integrating the density function f(x1, x2, . . . , xn) over this region. Of course, FΦ(φ) is just P[Φ ≤ φ]. Once we have FΦ(φ), we can find the density by differentiation.

1.2.2. Method of transformations (inverse mappings). Suppose we know the density function of X. Also suppose that the function y = Φ(x) is differentiable and monotonic for values within its range for which the density f(x) ≠ 0. This means that we can solve the equation y = Φ(x) for x as a function of y. We can then use this inverse mapping to find the density function of y. We can do a similar thing when there is more than one variable X, in which case there is more than one mapping Φ.

1.2.3. Method of moment generating functions. There is a theorem (Casella and Berger [2, p. 65]) stating that if two random variables have identical moment generating functions, then they possess the same probability distribution. The procedure is to find the moment generating function for Φ and then compare it to any and all known ones to see if there is a match. This is most commonly done to see if a distribution approaches the normal distribution as the sample size goes to infinity. The theorem is presented here for completeness.

Theorem 1. Let FX(x) and FY(y) be two cumulative distribution functions all of whose moments exist. Then

a: If X and Y have bounded support, then FX(u) = FY(u) for all u if and only if E[X^r] = E[Y^r] for all integers r = 0, 1, 2, . . . .

b: If the moment generating functions exist and MX(t) = MY(t) for all t in some neighborhood of 0, then FX(u) = FY(u) for all u.

For further discussion, see Billingsley [1, ch. 21-22].

Date: August 9, 2004.
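As a small illustration of the matching idea in Theorem 1b (this sketch is an added example, not part of the original notes), we can compute the moment generating function of a Binomial(4, 3/5) random variable directly from its mass function and compare it with the known closed form M_X(t) = (q + p e^t)^n at several values of t near 0:

```python
import math

# MGF of X ~ Binomial(n = 4, p = 3/5), computed two ways.
n, p = 4, 3 / 5
q = 1 - p

def pmf(x):
    # Binomial mass function: C(n, x) p^x q^(n - x).
    return math.comb(n, x) * p**x * q**(n - x)

def mgf_from_pmf(t):
    # Definition of the MGF: E[e^{tX}] as a sum over the support.
    return sum(math.exp(t * x) * pmf(x) for x in range(n + 1))

def mgf_closed_form(t):
    # Known closed form for the binomial MGF.
    return (q + p * math.exp(t)) ** n

# The two agree for all t in a neighborhood of 0, consistent with Theorem 1b.
for t in (-0.5, -0.1, 0.0, 0.1, 0.5):
    assert abs(mgf_from_pmf(t) - mgf_closed_form(t)) < 1e-12
```

Matching a computed MGF against a catalog of known MGFs in this way is exactly the comparison step described in section 1.2.3.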

2. DISTRIBUTION FUNCTION TECHNIQUE

2.1. Procedure for using the distribution function technique. As stated earlier, we find the region in x1, x2, x3, . . . , xn space such that Φ(x1, x2, . . . , xn) ≤ φ. We can then find the probability that Φ(x1, x2, . . . , xn) ≤ φ, i.e., P[Φ(x1, x2, . . . , xn) ≤ φ], by integrating the density function f(x1, x2, . . . , xn) over this region. Of course, FΦ(φ) is just P[Φ ≤ φ]. Once we have FΦ(φ), we can find the density by differentiation.

2.2. Example 1. Let the probability density function of X be given by

f(x) = 6x(1 − x),  0 < x < 1
     = 0,          otherwise   (3)

Now find the probability density of Y = X³.

Let G(y) denote the value of the distribution function of Y at y and write

G(y) = P(Y ≤ y)
     = P(X³ ≤ y)
     = P(X ≤ y^{1/3})
     = ∫_0^{y^{1/3}} 6x(1 − x) dx
     = ∫_0^{y^{1/3}} (6x − 6x²) dx
     = [3x² − 2x³]_0^{y^{1/3}}
     = 3y^{2/3} − 2y   (4)

Now differentiate G(y) to obtain the density function g(y):

g(y) = dG(y)/dy = d(3y^{2/3} − 2y)/dy = 2y^{−1/3} − 2 = 2(y^{−1/3} − 1),  0 < y < 1   (5)

2.3. Example 2. Let the joint probability density function of X1 and X2 be given by

f(x1, x2) = 2e^{−x1 − 2x2},  x1 > 0, x2 > 0
          = 0,               otherwise   (6)

Now find the probability density of Y = X1 + X2. Given that Y is a linear function of X1 and X2, we can easily find F(y) as follows.

Let FY(y) denote the value of the distribution function of Y at y and write

FY(y) = P(Y ≤ y)
      = ∫_0^y ∫_0^{y − x2} 2e^{−x1 − 2x2} dx1 dx2
      = ∫_0^y [−2e^{−x1 − 2x2}]_{x1 = 0}^{x1 = y − x2} dx2
      = ∫_0^y (2e^{−2x2} − 2e^{−y − x2}) dx2   (7)

Now integrate with respect to x2 as follows:

FY(y) = P(Y ≤ y)
      = ∫_0^y (2e^{−2x2} − 2e^{−y − x2}) dx2
      = [−e^{−2x2} + 2e^{−y − x2}]_{x2 = 0}^{x2 = y}
      = (−e^{−2y} + 2e^{−2y}) − (−1 + 2e^{−y})
      = e^{−2y} − 2e^{−y} + 1   (8)

Now differentiate FY(y) to obtain the density function f(y):

fY(y) = dF(y)/dy
      = d(e^{−2y} − 2e^{−y} + 1)/dy
      = −2e^{−2y} + 2e^{−y}
      = 2e^{−y}(1 − e^{−y})   (9)

2.4. Example 3. Let the probability density function of X be given by

fX(x) = (1/(σ√(2π))) e^{−(1/2)((x − µ)/σ)²}
      = (1/√(2πσ²)) exp[−(x − µ)²/(2σ²)],  −∞ < x < ∞   (10)

Now let Y = Φ(X) = e^X. We can then find the distribution of Y by integrating the density function of X over the appropriate area, defined as a function of y. Let FY(y) denote the value of the distribution function of Y at y and write

FY(y) = P(Y ≤ y)
      = P(e^X ≤ y)
      = P(X ≤ ln y),  y > 0
      = ∫_{−∞}^{ln y} (1/√(2πσ²)) exp[−(x − µ)²/(2σ²)] dx,  y > 0   (11)

Now differentiate FY(y) to obtain the density function f(y). In this case we will need the rules for differentiating under the integral sign. They are given by theorem 2, which we state below without proof.

Theorem 2. Suppose that f and ∂f/∂x are continuous in the rectangle

R = {(x, t) : a ≤ x ≤ b, c ≤ t ≤ d}

and suppose that u0(x) and u1(x) are continuously differentiable for a ≤ x ≤ b with the range of u0(x) and u1(x) in (c, d). If ψ is given by

ψ(x) = ∫_{u0(x)}^{u1(x)} f(x, t) dt   (12)

then

dψ/dx = (d/dx) ∫_{u0(x)}^{u1(x)} f(x, t) dt
      = f(x, u1(x)) · du1(x)/dx − f(x, u0(x)) · du0(x)/dx + ∫_{u0(x)}^{u1(x)} (∂f(x, t)/∂x) dt   (13)

If one of the bounds of integration does not depend on x, then the term involving its derivative will be zero. For a proof of theorem 2 see Protter and Morrey [3, p. 425]. Applying this to equation 11 we obtain

FY(y) = ∫_{−∞}^{ln y} (1/√(2πσ²)) exp[−(x − µ)²/(2σ²)] dx,  y > 0

F′Y(y) = fY(y)
       = (1/√(2πσ²)) exp[−(ln y − µ)²/(2σ²)] · d(ln y)/dy
       = (1/(y√(2πσ²))) exp[−(ln y − µ)²/(2σ²)],  y > 0   (14)
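The closed forms in Examples 1 and 2 are easy to sanity-check numerically (this check is an added sketch, not part of the original notes). The density f(x) = 6x(1 − x) in equation 3 is the Beta(2, 2) density, and the joint density in equation 6 factors into an Exp(1) density for X1 times an Exp(2) density for X2, so both examples can be simulated with the standard library:

```python
import math
import random

random.seed(0)
N = 200_000

# Example 1: f(x) = 6x(1 - x) is the Beta(2, 2) density.
# Check P(X^3 <= y) against the CDF in equation 4: G(y) = 3 y^(2/3) - 2y.
y = 0.4
hits = sum(random.betavariate(2, 2) ** 3 <= y for _ in range(N))
assert abs(hits / N - (3 * y ** (2 / 3) - 2 * y)) < 0.01

# Example 2: f(x1, x2) = 2 e^(-x1 - 2 x2) factors into independent
# Exp(1) and Exp(2) marginals.
# Check P(X1 + X2 <= y) against equation 8: F(y) = e^(-2y) - 2 e^(-y) + 1.
y = 1.5
hits = sum(random.expovariate(1) + random.expovariate(2) <= y for _ in range(N))
assert abs(hits / N - (math.exp(-2 * y) - 2 * math.exp(-y) + 1)) < 0.01
```

With 200,000 draws the Monte Carlo error is on the order of 0.002, so a 0.01 tolerance comfortably confirms both derivations.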

TABLE 1. Outcomes, Probabilities and Number of Heads from Tossing a Coin Four Times.

Element of sample space   Probability   Value of random variable X(x)
HHHH                      81/625        4
HHHT                      54/625        3
HHTH                      54/625        3
HTHH                      54/625        3
THHH                      54/625        3
HHTT                      36/625        2
HTHT                      36/625        2
HTTH                      36/625        2
THHT                      36/625        2
THTH                      36/625        2
TTHH                      36/625        2
HTTT                      24/625        1
THTT                      24/625        1
TTHT                      24/625        1
TTTH                      24/625        1
TTTT                      16/625        0

3. METHOD OF TRANSFORMATIONS (SINGLE VARIABLE)

3.1. Discrete examples of the method of transformations.

3.1.1. One-to-one function. Find a formula for the probability distribution of the total number of heads obtained in four tosses of a coin where the probability of a head is 0.60.

The sample space, probabilities and the value of the random variable are given in table 1. From the table we can determine the probabilities as

P(X = 0) = 16/625,  P(X = 1) = 96/625,  P(X = 2) = 216/625,  P(X = 3) = 216/625,  P(X = 4) = 81/625

The probability of three heads and one tail for all possible combinations is (3/5)(3/5)(3/5)(2/5) for each arrangement, or 4 (3/5)³ (2/5). Similarly, the probability of one head and three tails for all possible combinations is (3/5)(2/5)(2/5)(2/5) for each arrangement, or

4 (3/5) (2/5)³.

There is one way to obtain four heads, four ways to obtain three heads, six ways to obtain two heads, four ways to obtain one head and one way to obtain zero heads. These five numbers, 1, 4, 6, 4, 1, are a set of binomial coefficients. We can then write the probability mass function as

f(x) = C(4, x) (3/5)^x (2/5)^{4−x},  for x = 0, 1, 2, 3, 4   (15)

This, of course, is the binomial distribution. The probabilities of the various possible random variables are contained in table 2.

TABLE 2. Probability of Number of Heads from Tossing a Coin Four Times

Number of heads x   f(x)
0                   16/625
1                   96/625
2                   216/625
3                   216/625
4                   81/625

Now consider a transformation of X in the form Y = 2X² + X. There are five possible outcomes for Y, i.e., 0, 3, 10, 21, 36. Given that the function is one-to-one, we can make up a table describing the probability distribution for Y.

TABLE 3. Probability of a Function of the Number of Heads from Tossing a Coin Four Times.

Number of heads x   f(x)      y = 2x² + x   g(y)
0                   16/625    0             16/625
1                   96/625    3             96/625
2                   216/625   10            216/625
3                   216/625   21            216/625
4                   81/625    36            81/625

3.1.2. Case where the transformation is not one-to-one. Now let the transformation of X be given by Z = (6 − 2X)². The possible values for Z are 0, 4, 16, 36. When X = 2 and when X = 4, Z = 4. We can find the probability of Z by adding the probabilities for cases when X gives more than one value, as shown in table 4.

TABLE 4. Probability of a Function of the Number of Heads from Tossing a Coin Four Times (not one-to-one).

z = (6 − 2x)²   x       g(z)
0               3       216/625
4               2, 4    216/625 + 81/625 = 297/625
16              1       96/625
36              0       16/625

3.2. Intuitive Idea of the Method of Transformations. The idea of a transformation is to consider the function that maps the random variable X into the random variable Y. If we can determine the values of X that lead to any particular value of Y, we can obtain the probability of Y by summing the probabilities of those values of X that map into Y. In the continuous case, to find the distribution function, we want to integrate the density of X over the portion of its space that is mapped into the portion of Y in which we are interested. Suppose, for example, that both X and Y are defined on the real line with 0 ≤ X ≤ 1 and 0 ≤ Y ≤ 10. If we want to know G(5), where G(y) is the probability that Y ≤ y, we need to integrate the density of X over all values of x leading to a value of y less than five.

3.3. General formula when the random variable is discrete. Consider a transformation defined by y = Φ(x). The function Φ defines a mapping from the sample space of the variable X to a sample space for the random variable Y. If X is discrete with frequency function pX, then Φ(X) is discrete and has frequency function

p_{Φ(X)}(t) = Σ_{x : Φ(x) = t} pX(x) = Σ_{x ∈ Φ^{−1}(t)} pX(x)   (16)

The process is simple in this case. One identifies Φ^{−1}(t) for each t in the sample space of the random variable Y, and then sums the probabilities.

3.4. General change of variable or transformation formula.

Theorem 3. Let fX(x) be the value of the probability density of the continuous random variable X at x. If the function y = Φ(x) is differentiable and either increasing or decreasing (monotonic) for all values within the range of X for which fX(x) ≠ 0, then for these values of x the equation y = Φ(x) can be uniquely solved for x to give x = Φ^{−1}(y) = w(y), where w(·) = Φ^{−1}(·). Then for the corresponding values of y, the probability density of Y = Φ(X) is given by

g(y) = fY(y) = fX(Φ^{−1}(y)) · |dΦ^{−1}(y)/dy| = fX[w(y)] · |w′(y)|,  if dΦ(x)/dx ≠ 0
            = 0,  otherwise   (17)
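Equation 16 translates directly into code. The following sketch (an added example, not part of the original notes) recomputes Table 4 by summing exact binomial probabilities over the preimage of each value of Z = (6 − 2X)²:

```python
import math
from collections import defaultdict
from fractions import Fraction

# pmf of X = number of heads in four tosses with P(head) = 3/5, per equation 15.
p = Fraction(3, 5)
p_X = {x: math.comb(4, x) * p**x * (1 - p) ** (4 - x) for x in range(5)}

def phi(x):
    # The non-one-to-one transformation from section 3.1.2.
    return (6 - 2 * x) ** 2

# Equation 16: p_{Phi(X)}(t) = sum of p_X(x) over the preimage {x : Phi(x) = t}.
p_Z = defaultdict(Fraction)
for x, prob in p_X.items():
    p_Z[phi(x)] += prob

# Matches Table 4, including the non-one-to-one value z = 4,
# whose preimage is {2, 4}.
assert p_Z[4] == Fraction(297, 625)
assert p_Z[0] == Fraction(216, 625)
assert sum(p_Z.values()) == 1
```

Using exact fractions makes the comparison with the table entries literal rather than approximate.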

Proof. Consider the diagram in figure 1.

FIGURE 1. y = Φ(x) is an increasing function.

As can be seen from figure 1, each point on the y axis maps into a point on the x axis; that is, X must take on a value between Φ^{−1}(a) and Φ^{−1}(b) when Y takes on a value between a and b. Therefore

P(a < Y < b) = P(Φ^{−1}(a) < X < Φ^{−1}(b)) = ∫_{Φ^{−1}(a)}^{Φ^{−1}(b)} fX(x) dx   (18)

What we would like to do is replace x in the second line with y, and Φ^{−1}(a) and Φ^{−1}(b) with a and b. To do so we need to make a change of variable. Consider how we make a u-substitution when we perform integration or use the chain rule for differentiation. For example, if u = h(x), then du = h′(x) dx. So if x = Φ^{−1}(y), then

dx = (dΦ^{−1}(y)/dy) dy.

Then we can write

∫ fX(x) dx = ∫ fX(Φ^{−1}(y)) (dΦ^{−1}(y)/dy) dy   (19)

For the case of a definite integral the following lemma applies.

Lemma 1. If the function u = h(x) has a continuous derivative on the closed interval [a, b] and f is continuous on the range of h, then

∫_a^b f(h(x)) h′(x) dx = ∫_{h(a)}^{h(b)} f(u) du   (20)

Using this lemma, or the intuition from equation 19, we can then rewrite equation 18 as follows:

P(a < Y < b) = P(Φ^{−1}(a) < X < Φ^{−1}(b))
             = ∫_{Φ^{−1}(a)}^{Φ^{−1}(b)} fX(x) dx
             = ∫_a^b fX(Φ^{−1}(y)) (dΦ^{−1}(y)/dy) dy   (21)

The probability density function fY(y) of a continuous random variable Y is the function f(·) that satisfies

P(a < Y < b) = F(b) − F(a) = ∫_a^b fY(t) dt   (22)

This then implies that the integrand in equation 21 is the density of Y, i.e., g(y), so we obtain

g(y) = fY(y) = fX(Φ^{−1}(y)) · dΦ^{−1}(y)/dy   (23)

as long as dΦ^{−1}(y)/dy exists. This proves the theorem if Φ is an increasing function. Now consider the case where Φ is a decreasing function, as in figure 2.

As can be seen from figure 2, each point on the y axis maps into a point on the x axis; that is, X must take on a value between Φ^{−1}(b) and Φ^{−1}(a) when Y takes on a value between a and b. Therefore

P(a < Y < b) = P(Φ^{−1}(b) < X < Φ^{−1}(a)) = ∫_{Φ^{−1}(b)}^{Φ^{−1}(a)} fX(x) dx   (24)

Making a change of variable for x = Φ^{−1}(y) as before, we can write

FIGURE 2. y = Φ(x) is a decreasing function.

P(a < Y < b) = P(Φ^{−1}(b) < X < Φ^{−1}(a))
             = ∫_{Φ^{−1}(b)}^{Φ^{−1}(a)} fX(x) dx
             = ∫_b^a fX(Φ^{−1}(y)) (dΦ^{−1}(y)/dy) dy
             = −∫_a^b fX(Φ^{−1}(y)) (dΦ^{−1}(y)/dy) dy   (25)

Because

dΦ^{−1}(y)/dy = 1/(dy/dx)

is positive when the function y = Φ(x) is increasing, and

−dΦ^{−1}(y)/dy is positive when y = Φ(x) is decreasing, we can combine the two cases by writing

g(y) = fY(y) = fX(Φ^{−1}(y)) · |dΦ^{−1}(y)/dy|   (26)

3.5. Examples.

3.5.1. Example 1. Let X have the probability density function given by

fX(x) = (1/2)x,  0 ≤ x ≤ 2
      = 0,       elsewhere   (27)

Find the density function of Y = Φ(X) = 6X − 3.

Notice that fX(x) is positive for all x such that 0 < x < 2. The function Φ is increasing for all X. We can then find the inverse function Φ^{−1} as follows:

y = 6x − 3
6x = y + 3
x = (y + 3)/6 = Φ^{−1}(y)   (28)

We can then find the derivative of Φ^{−1} with respect to y as

dΦ^{−1}/dy = d((y + 3)/6)/dy = 1/6   (29)

The density of Y is then

g(y) = fY(y) = fX(Φ^{−1}(y)) · |dΦ^{−1}(y)/dy| = (1/2) · ((y + 3)/6) · (1/6),  0 ≤ (y + 3)/6 ≤ 2   (30)

For all other values of y, g(y) = 0. Simplifying the density and the bounds we obtain

g(y) = fY(y) = (y + 3)/72,  −3 ≤ y ≤ 9
             = 0,           elsewhere   (31)
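Equation 31 can be checked by simulation (an added sketch, not part of the original notes). Since F_X(x) = x²/4 on [0, 2], inverse transform sampling gives X = 2√U for U uniform on (0, 1), and the CDF implied by g(y) = (y + 3)/72 is G(y) = (y + 3)²/144 on [−3, 9]:

```python
import math
import random

random.seed(1)
N = 200_000

# Inverse-transform sampling: F_X(x) = x^2 / 4 on [0, 2], so X = 2 * sqrt(U).
# Then apply the transformation Y = 6X - 3 from Example 1.
ys = [6 * (2 * math.sqrt(random.random())) - 3 for _ in range(N)]

# CDF implied by g(y) = (y + 3)/72 on [-3, 9]: G(y) = (y + 3)^2 / 144.
for y0 in (-1.0, 3.0, 7.0):
    empirical = sum(y <= y0 for y in ys) / N
    assert abs(empirical - (y0 + 3) ** 2 / 144) < 0.01
```

The empirical distribution of the transformed draws matches the closed form at each checkpoint to within Monte Carlo error.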

3.5.2. Example 2. Let X have the probability density function given by fX(x). Then consider the transformation Y = Φ(X) = σX + µ, σ ≠ 0. The function Φ is monotonic for all X (increasing if σ > 0, decreasing if σ < 0). We can then find the inverse function Φ^{−1} as follows:

y = σx + µ
σx = y − µ
x = (y − µ)/σ = Φ^{−1}(y)   (32)

We can then find the derivative of Φ^{−1} with respect to y as

dΦ^{−1}/dy = d((y − µ)/σ)/dy = 1/σ   (33)

The density of Y is then

fY(y) = fX(Φ^{−1}(y)) · |dΦ^{−1}(y)/dy|
      = fX((y − µ)/σ) · (1/|σ|)   (34)

3.5.3. Example 3. Let X have the probability density function given by

fX(x) = e^{−x},  0 ≤ x < ∞
      = 0,       elsewhere   (35)

Find the density function of Y = X^{1/2}.

Notice that fX(x) is positive for all x such that 0 ≤ x < ∞. The function Φ is increasing for all X. We can then find the inverse function Φ^{−1} as follows:

y = x^{1/2}
y² = x
x = Φ^{−1}(y) = y²   (36)

We can then find the derivative of Φ^{−1} with respect to y as

dΦ^{−1}/dy = d(y²)/dy = 2y   (37)

The density of Y is then

fY(y) = fX(Φ^{−1}(y)) · |dΦ^{−1}(y)/dy| = e^{−y²} · 2y   (38)

A graph of the two density functions is shown in figure 3.
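Example 3 is also easy to verify by simulation (an added sketch, not part of the original notes): if X ~ Exp(1), then Y = √X should satisfy P(Y ≤ y) = 1 − e^{−y²}, the CDF obtained by integrating 2y e^{−y²} from 0 to y:

```python
import math
import random

random.seed(2)
N = 200_000

# Sample X ~ Exp(1), the density e^{-x} of equation 35, and transform: Y = sqrt(X).
ys = [math.sqrt(random.expovariate(1)) for _ in range(N)]

# Integrating f_Y(y) = 2 y e^{-y^2} from 0 to y gives F_Y(y) = 1 - e^{-y^2}.
for y0 in (0.5, 1.0, 2.0):
    empirical = sum(y <= y0 for y in ys) / N
    assert abs(empirical - (1 - math.exp(-y0 ** 2))) < 0.01
```

Note that this is one direction only: agreement of the empirical CDF with 1 − e^{−y²} confirms the change-of-variable result of equation 38 numerically, not as a proof.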

FIGURE 3. The Two Density Functions f(x) and g(y).

4. METHOD OF TRANSFORMATIONS (MULTIPLE VARIABLES)

4.1. General definition of a transformation. Let Φ be any function from R^k to R^m, k, m ≥ 1, such that Φ^{−1}(A) = {x ∈ R^k : Φ(x) ∈ A} ∈ ß^k for every A ∈ ß^m, where ß^m is the smallest σ-field having all the open rectangles in R^m as members. If we write y = Φ(x), the function Φ defines a mapping from the sample space (Ξ) of the variable X to the sample space (Ψ) of the random variable Y. Specifically

Φ(x) : Ξ → Ψ   (39)

and

Φ^{−1}(A) = {x ∈ Ξ : Φ(x) ∈ A}   (40)

4.2. Transformations involving multiple functions of multiple random variables.

Theorem 4. Let fX1X2(x1, x2) be the value of the joint probability density of the continuous random variables X1 and X2 at (x1, x2). If the functions given by y1 = u1(x1, x2) and y2 = u2(x1, x2) are partially differentiable with respect to x1 and x2 and represent a one-to-one transformation for all values within the range of X1 and X2 for which fX1X2(x1, x2) ≠ 0, then, for these values of x1 and x2, the equations y1 = u1(x1, x2) and y2 = u2(x1, x2) can be uniquely solved for x1 and x2 to give x1 = w1(y1, y2) and x2 = w2(y1, y2), and for the corresponding values of y1 and y2 the joint probability density of Y1 = u1(X1, X2) and Y2 = u2(X1, X2) is given by

fY1Y2(y1, y2) = fX1X2[w1(y1, y2), w2(y1, y2)] · |J|   (41)

where J is the Jacobian of the transformation and is defined as the determinant

J = | ∂x1/∂y1  ∂x1/∂y2 |
    | ∂x2/∂y1  ∂x2/∂y2 |   (42)

At all other points fY1Y2(y1, y2) = 0.

4.3. Example. Let the probability density function of X1 and X2 be given by

fX1X2(x1, x2) = e^{−(x1 + x2)},  x1 > 0, x2 > 0
              = 0,              elsewhere

Consider two random variables Y1 and Y2 defined in the following manner:

Y1 = X1 + X2
Y2 = X1/(X1 + X2)   (43)

To find the joint density of Y1 and Y2, and also the marginal density of Y2, we first need to solve the system of equations in equation 43 for X1 and X2:

Y1 = X1 + X2  ⇒  X2 = Y1 − X1
Y2 = X1/(X1 + X2) = X1/Y1  ⇒  X1 = Y1 Y2
X2 = Y1 − Y1 Y2 = Y1 (1 − Y2)   (44)

The Jacobian is given by

J = | ∂x1/∂y1  ∂x1/∂y2 |   | y2      y1  |
    | ∂x2/∂y1  ∂x2/∂y2 | = | 1 − y2  −y1 |
  = (y2)(−y1) − (y1)(1 − y2)
  = −y1 y2 − y1 + y1 y2
  = −y1   (45)

This transformation is one-to-one and maps the domain of X (Ξ), given by x1 > 0 and x2 > 0 in the x1x2-plane, into the domain of Y (Ψ) in the y1y2-plane, given by y1 > 0 and 0 < y2 < 1. If we apply theorem 4 we obtain

fY1Y2(y1, y2) = fX1X2[w1(y1, y2), w2(y1, y2)] · |J|
              = e^{−(y1 y2 + y1 − y1 y2)} · |−y1|
              = y1 e^{−y1}   (46)

Considering all possible values of y1 and y2 we obtain

fY1Y2(y1, y2) = y1 e^{−y1},  y1 > 0, 0 < y2 < 1
              = 0,           elsewhere

We can then find the marginal density of Y2 by integrating over y1 as follows:

fY2(y2) = ∫_0^∞ fY1Y2(y1, y2) dy1 = ∫_0^∞ y1 e^{−y1} dy1   (47)

We integrate by parts, where u, v, du, and dv are defined as

u = y1,    dv = e^{−y1} dy1
du = dy1,  v = −e^{−y1}   (48)

fY2(y2) = ∫_0^∞ y1 e^{−y1} dy1
        = [−y1 e^{−y1}]_0^∞ + ∫_0^∞ e^{−y1} dy1
        = (0 − 0) + [−e^{−y1}]_0^∞
        = 0 + (0 − (−1))
        = 1   (49)

This then implies that fY2(y2) = 1 for all y2 such that 0 < y2 < 1; that is, Y2 is uniformly distributed on (0, 1).
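A simulation sanity check of this example (an added sketch, not part of the original notes): with X1, X2 independent Exp(1), Y2 = X1/(X1 + X2) should be uniform on (0, 1), and the joint density y1 e^{−y1} integrates over y2 to a marginal for Y1 with CDF ∫_0^y t e^{−t} dt = 1 − e^{−y}(1 + y):

```python
import math
import random

random.seed(3)
N = 200_000

# X1, X2 independent Exp(1), matching f(x1, x2) = e^{-(x1 + x2)}.
pairs = [(random.expovariate(1), random.expovariate(1)) for _ in range(N)]
y1s = [x1 + x2 for x1, x2 in pairs]
y2s = [x1 / (x1 + x2) for x1, x2 in pairs]

# Marginal of Y2 is 1 on (0, 1): the empirical CDF should be the identity.
for c in (0.25, 0.5, 0.75):
    assert abs(sum(y <= c for y in y2s) / N - c) < 0.01

# Marginal of Y1 has density y1 e^{-y1}, with CDF 1 - e^{-y}(1 + y).
for c in (1.0, 2.0, 4.0):
    assert abs(sum(y <= c for y in y1s) / N - (1 - math.exp(-c) * (1 + c))) < 0.01
```

This also illustrates why the example is a classic: the sum and the proportion are independent, with the proportion exactly uniform.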

A graph of the joint densities and the marginal density follows. The joint density of (X1, X2) is shown in figure 4.

FIGURE 4. Joint Density of X1 and X2.

The joint density of (Y1, Y2) is contained in figure 5.

FIGURE 5. Joint Density of Y1 and Y2.

The marginal density of Y2 is shown graphically in figure 6.

FIGURE 6. Marginal Density of Y2.

REFERENCES

[1] Billingsley, P. Probability and Measure. 3rd edition. New York: Wiley, 1995.
[2] Casella, G., and R. L. Berger. Statistical Inference. Pacific Grove, CA: Duxbury, 2002.
[3] Protter, Murray H., and Charles B. Morrey, Jr. Intermediate Calculus. New York: Springer-Verlag, 1985.
