
Sums of Random Variables

Module by: Nick Kingsbury

Summary: This module introduces sums of random variables.

Consider the random variable $Y$ formed as the sum of two independent random variables $X_1$ and $X_2$:
$$Y = X_1 + X_2 \qquad (1)$$
where $X_1$ has pdf $f_1(x_1)$ and $X_2$ has pdf $f_2(x_2)$.
We can write the joint pdf for $y$ and $x_1$ by rewriting the conditional probability formula:
$$f(y, x_1) = f(y \mid x_1)\, f_1(x_1) \qquad (2)$$
It is clear that the event '$Y$ takes the value $y$ conditional upon $X_1 = x_1$' is equivalent to $X_2$ taking the value $y - x_1$ (since $X_2 = Y - X_1$). Hence
$$f(y \mid x_1) = f_2(y - x_1) \qquad (3)$$
Now $f(y)$ may be obtained using the marginal probability formula (see the discussion of probability density functions). Hence
$$f(y) = \int f(y \mid x_1)\, f_1(x_1)\, dx_1 = \int f_2(y - x_1)\, f_1(x_1)\, dx_1 = f_2 * f_1 \qquad (4)$$
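As a quick numerical check of equation (4), the sketch below (an illustrative example, not from the original module) takes $X_1$ and $X_2$ both uniform on $[0, 1]$ and approximates the convolution integral with a discrete sum; the resulting pdf of $Y$ is triangular on $[0, 2]$ with its apex at $y = 1$.

```python
import numpy as np

# Illustrative assumption: X1, X2 both uniform on [0, 1], so f1 = f2 = 1 there.
dx = 0.001
x = np.arange(0.0, 1.0, dx)
f1 = np.ones_like(x)          # uniform pdf on [0, 1]
f2 = np.ones_like(x)

# Discrete approximation of the convolution integral (4); the factor dx
# converts the discrete sum back into an approximation of the integral.
fy = np.convolve(f2, f1) * dx
y = np.arange(len(fy)) * dx

peak = y[np.argmax(fy)]       # apex of the triangle, near y = 1
area = np.sum(fy) * dx        # total probability, near 1
print(peak, area)
```

The same discrete-convolution approach works for any pair of sampled pdfs, provided both are tabulated on the same grid spacing.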
This result may be extended to sums of three or more random variables by repeated application of the above arguments for each new variable in turn. Since convolution is a commutative operation, for nn independent variables we get:
$$f(y) = f_n * f_{n-1} * \cdots * f_2 * f_1 \qquad (5)$$
An example of this effect occurs when multiple dice are thrown and the scores are added together. In the two-dice example (subfigures (a), (b), and (c) of the figure in the discussion of probability distributions), we saw how the pmf approximates a triangular shape. This is just the convolution of two uniform 6-point pmfs, one for each die.
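The two-dice case can be computed directly: convolving the uniform pmf of a single fair die with itself yields the familiar triangular pmf over the scores 2 to 12 (a minimal sketch; the variable names are mine, not from the module).

```python
import numpy as np

# pmf of a single fair die: uniform over the six scores 1..6
die = np.full(6, 1 / 6)

# pmf of the sum of two dice is the convolution of the two pmfs (equation (4),
# in discrete form); the result covers the scores 2..12
two_dice = np.convolve(die, die)
scores = np.arange(2, 13)

# Triangular shape: rises linearly to a peak at a score of 7, then falls
for s, p in zip(scores, two_dice):
    print(s, round(p, 4))
```

The peak probability is $6/36$ at a score of 7, and the pmf falls away symmetrically on either side, exactly the triangular shape noted above.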
Similarly, if two variables with Gaussian pdfs are added together, then, as we shall show in the discussion of the summation of two or more Gaussian random variables, the result is another Gaussian pdf whose variance is the sum of the two input variances.
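While the formal proof is deferred to that later discussion, the variance-addition property is easy to check by simulation (an illustrative sketch; the variances 1 and 4 are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent zero-mean Gaussians with variances 1.0 and 4.0
x1 = rng.normal(0.0, 1.0, size=n)   # std dev 1, variance 1
x2 = rng.normal(0.0, 2.0, size=n)   # std dev 2, variance 4

y = x1 + x2
var_y = y.var()
print(var_y)   # close to 1 + 4 = 5
```

The sample variance of $Y$ comes out close to 5, consistent with the claim that the variance of the sum equals the sum of the input variances.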
