Independent random variables

Random variables are independent when the value taken by one of them has no effect on the probabilities of the values taken by the others in the same trial. In other words, neither one affects the probability of an event involving the other. As we can see, the concept of independent random variables is analogous to the concept of independent events.


Let $X$ and $Y$ be discrete random variables with values $(a_{i})_{i=1}^{\infty}$ and $(b_{j})_{j=1}^{\infty}$ respectively. We say $X$ and $Y$ are independent if:

$$P(X=a_{i}, Y= b_{j})=P(X=a_{i}) \cdot P(Y=b_{j}), \forall i,j \in \mathbb{N}$$

In general, if the two random variables are independent we can write $$P(X \in A, Y \in B)=P(X \in A) \cdot P(Y \in B)$$
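The product rule can be checked by exact enumeration. A minimal sketch in Python, assuming two independent rolls of a fair die (the particular events $A$ and $B$ are hypothetical choices, not from the text):

```python
from fractions import Fraction

# All 36 equally likely (x, y) pairs for two independent fair dice
outcomes = [(x, y) for x in range(1, 7) for y in range(1, 7)]
A = {2, 4, 6}   # event: X is even
B = {5, 6}      # event: Y is at least 5

p_joint = Fraction(sum(1 for x, y in outcomes if x in A and y in B), len(outcomes))
p_A = Fraction(len(A), 6)
p_B = Fraction(len(B), 6)

print(p_joint == p_A * p_B)  # True: P(X in A, Y in B) = P(X in A) * P(Y in B)
```

Exact rational arithmetic with `Fraction` avoids any floating-point doubt in the comparison.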

Expected value and variance

(a) If $X$ and $Y$ are independent random variables, then $$E[X \cdot Y]=E[X]\cdot E[Y]$$ More generally, if $X_{1}, \dots, X_{n}$ are independent, $$E[\prod_{i=1}^{n}X_{i}]= \prod_{i=1}^{n} E[X_{i}]$$
(b) If $X$ and $Y$ are independent random variables, then $$Var(X+Y)=Var(X)+Var(Y)$$ More generally, if $X_{1}, \dots, X_{n}$ are independent, $$Var(\sum_{i=1}^{n}X_{i})= \sum_{i=1}^{n} Var(X_{i})$$

(*) Remember, $E[X + Y]=E[X] + E[Y]$ for every $X$ and $Y$. They don’t need to be independent.
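Both properties can be verified exactly on a small joint distribution. A sketch assuming two independent fair dice (the example itself is an assumption, not taken from the text):

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely (x, y) pairs for two independent fair dice
pairs = list(product(range(1, 7), repeat=2))

def E(f):
    # expectation of f(X, Y) under the uniform joint law
    return Fraction(sum(f(x, y) for x, y in pairs), len(pairs))

def var(f):
    return E(lambda x, y: f(x, y) ** 2) - E(f) ** 2

print(E(lambda x, y: x * y) == E(lambda x, y: x) * E(lambda x, y: y))        # True
print(var(lambda x, y: x + y) == var(lambda x, y: x) + var(lambda x, y: y))  # True
```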

Example 1

 Let $X$ and $Y$ be independent random variables. For $X \sim P(3)$ and $Y \sim P(3)$ calculate $P(X+Y=1)$ and $E[(X-Y)^{2}]$.


$X$ and $Y$ are Poisson random variables with parameter $3$. Poisson variables take values in $\mathbb{N}_{0}$, which means the only way their sum can equal $1$ is through $\{X=1, Y=0\}$ or $\{X=0, Y=1\}$. Since these two events are mutually exclusive, we have $$P(X+Y=1)=P(X=0, Y=1)+P(X=1,Y=0)$$ Furthermore, since $X$ and $Y$ are independent we can use the definition above

$$P(X+Y=1)=P(X=0) \cdot P(Y=1) +P(X=1) \cdot P(Y=0) $$

Using the Poisson’s distribution for $\lambda=3$, we have$$\displaystyle{P(X=0) \cdot P(Y=1)=e^{-3} \cdot \frac{3^{0}}{0!} \cdot e^{-3} \cdot \frac{3^{1}}{1!}}$$

Similarly, $$\displaystyle{P(X=1) \cdot P(Y=0)=e^{-3} \cdot \frac{3^{1}}{1!} \cdot e^{-3} \cdot \frac{3^{0}}{0!}}$$

As a result, $P(X+Y=1)= e^{-6} \cdot 6$.
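A short numeric check of this result; the `pois` helper below is defined here for illustration, not a library function:

```python
import math

def pois(k, lam):
    # P(X = k) for a Poisson variable with parameter lam
    return math.exp(-lam) * lam ** k / math.factorial(k)

# P(X + Y = 1) via independence, exactly as derived above
p = pois(0, 3) * pois(1, 3) + pois(1, 3) * pois(0, 3)
print(abs(p - 6 * math.exp(-6)) < 1e-12)  # True

# Cross-check: the sum of independent Poisson variables is again Poisson,
# so X + Y ~ P(6) and P(X + Y = 1) = e^{-6} * 6^1 / 1!
print(abs(p - pois(1, 6)) < 1e-12)  # True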

Now we have to find $E[(X-Y)^{2}]$. Expanding the square gives $$E[(X-Y)^{2}]=E[X^{2}-2XY+Y^{2}]$$ Using the linearity of expectation, $E[X+Y]=E[X]+E[Y]$, and the fact that $X$ and $Y$ are independent random variables, we have $$E[(X-Y)^{2}]=E[X^{2}]-2E[X]E[Y]+E[Y^{2}]$$

Let's calculate the needed expected values separately. In the lesson Expected value, variance and standard deviation we learned that for a Poisson random variable with parameter $\lambda$:

$\bullet$ the expected value is $EX=\lambda$

$\bullet$ the variance is $Var(X)=\lambda$

Accordingly, $EX=3$, $EY=3$ and $E[X^{2}]=VarX + (EX)^{2}= 3 + 3^{2}= 12$. Similarly, $E[Y^{2}]=12$.

Finally, when we put the calculated values in the expression above we get $$E[(X-Y)^{2}]=12-2 \cdot 3 \cdot 3 +12=6$$
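For the skeptical, a Monte Carlo sketch of the same result. The Poisson sampler below is Knuth's classical method, an implementation choice rather than part of the lesson:

```python
import math
import random

def sample_poisson(lam, rng):
    # Knuth's method: count uniform draws until their running product
    # drops below e^{-lam}
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(0)
n = 200_000
est = sum((sample_poisson(3, rng) - sample_poisson(3, rng)) ** 2
          for _ in range(n)) / n
print(est)  # close to the exact value 6
```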

Proposition. Let $X_{1}, \dots, X_{n}$ be independent Bernoulli random variables with parameter $p$, that is, $X_{i} \sim B(1,p)$ for $i=1,\dots,n$, and let $X=\sum_{i=1}^{n} X_{i}$. Using the properties of independent random variables we have:

$\bullet$ $E[X]=E[\sum_{i=1}^{n} X_{i}]= \sum_{i=1}^{n} E[X_{i}]= n\cdot p$

$\bullet$ $Var(X)=Var(\sum_{i=1}^{n} X_{i})= \sum_{i=1}^{n} Var(X_{i})= n\cdot p \cdot (1-p)$
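These identities can be confirmed exactly for a concrete (hypothetical) choice of $n$ and $p$, by enumerating the binomial distribution of $X$ directly:

```python
from fractions import Fraction
from math import comb

n, p = 10, Fraction(1, 4)  # hypothetical example values
# X = sum of n Bernoulli(p) variables has the binomial pmf
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

EX = sum(k * pk for k, pk in pmf.items())
VarX = sum(k**2 * pk for k, pk in pmf.items()) - EX**2

print(EX == n * p)              # True
print(VarX == n * p * (1 - p))  # True
```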

Proposition. If a random variable $X$ has values in $\mathbb{N}_{0}$, then $E[X]=\sum_{n=0}^{\infty} P(X>n)$.

For example, let $X\sim G(p)$, $p>0$, be a geometric random variable. We saw that $P(X>n)=(1-p)^{n}=q^{n}$ for all $n \in \mathbb{N}_{0}$. As a result, $E[X]=\sum_{n=0}^{\infty} P(X>n)= \sum_{n=0}^{\infty} q^{n}=\frac{1}{1-q}=\frac{1}{p}$.
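A quick numerical check of the tail-sum formula, assuming $p=\frac{1}{6}$ as an example:

```python
from fractions import Fraction

p = Fraction(1, 6)
q = 1 - p
N = 200
# Partial sum of P(X > n) = q^n over n = 0, ..., N-1
partial = sum(q ** n for n in range(N))

print(float(partial))  # approaches 1/p = 6 as N grows
```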

Example 2

Let's roll a fair die until we get $6$ for the second time, and let $X$ be the number of rolls needed. Find

(a) $P(X \leq 4)$
(b) $E[X], Var(X)$


At first glance, $X$ looks like a geometric random variable. However, $G(1,p)$ represents the number of independent trials until the first success, while here we need the number of trials until the second success, so $X$ is not geometric. But there is a trick we can use: we can split $X$ into two random variables, $X_{1}$ and $X_{2}$.

$\bullet$ $X_{1}$ represents the number of rolls up to and including the first six (success)
$\bullet$ $X_{2}$ represents the number of additional rolls after the first six, up to and including the second six

Each of them is a geometric random variable with $p=\frac{1}{6}$. As a result, we can view $X$ as the sum of two geometric variables: $X=X_{1} + X_{2}$.

Even though we decided to split the random variable $X$ into two geometric random variables, let's first calculate the distribution of $X$ by hand.

To get two $6$'s we have to roll the die at least $2$ times, and we don't know in advance how many rolls we'll need. As a result, $X \in \{2,3,4,\dots\}=\mathbb{N} \setminus \{1\}$.

Now let's find the probability that exactly $n$ rolls are needed. The event $\{X=n\}$ means that the $n$th roll is a $6$, exactly one of the first $n-1$ rolls is a $6$, and the remaining $n-2$ rolls show one of the other $5$ faces. Each roll has $6$ possible outcomes, so with $n$ rolls there are $6^{n}$ equally likely outcomes in total, and there are $n-1$ ways to choose the position of the first six. As a result, the probability of $X=n$ is $$\displaystyle{P(X=n)= \frac{(n-1)\cdot 1 \cdot 1 \cdot 5^{n-2}}{6^{n}}}$$

(a) We know that $P(X\leq 4)=P(X=2)+P(X=3)+P(X=4)$. From the formula above it follows $$\displaystyle{P(X\leq 4)= \frac{1\cdot 5^{0}}{6^{2}} + \frac{2\cdot 5^{1}}{6^{3}} + \frac{3\cdot 5^{2}}{6^{4}} = \frac{19}{144} \approx 0.132}$$
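The same value can be computed exactly in a couple of lines; the `pmf` helper just encodes the formula for $P(X=n)$ derived above:

```python
from fractions import Fraction

def pmf(n):
    # P(X = n) = (n-1) * 5^(n-2) / 6^n: the second six lands on roll n
    return Fraction((n - 1) * 5 ** (n - 2), 6 ** n)

p = pmf(2) + pmf(3) + pmf(4)
print(p, float(p))  # 19/144 ≈ 0.132
```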

(b) For calculating $E[X]$ and $Var(X)$ we'll use the trick mentioned before: $X=X_{1}+X_{2}$, with $X_{1}, X_{2}\sim G(\frac{1}{6})$ independent. We treat the rolls as two identical experiments run one after the other: $X_{1}$ counts the rolls until the first six, and $X_{2}$ counts the rolls until the next six, as if we restarted the experiment after the first six. This makes things a whole lot easier. Since $X=X_{1} + X_{2}$, $$\displaystyle{E[X]=E[X_{1}+X_{2}]=E[X_{1}] + E[X_{2}]= \frac{1}{\frac{1}{6}}+ \frac{1}{\frac{1}{6}} = 6+6=12}$$ Similarly, since $X_{1}$ and $X_{2}$ are independent,

$$\displaystyle{Var(X_{1}+X_{2})=Var(X_{1}) + Var(X_{2})=\frac{\frac{5}{6}}{(\frac{1}{6})^{2}} +\frac{\frac{5}{6}}{(\frac{1}{6})^{2}}=60}$$
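Both answers can be sanity-checked by simulating the experiment directly, a minimal sketch:

```python
import random

def rolls_until_second_six(rng):
    # Roll a fair die until the second six appears; return the roll count
    count = sixes = 0
    while sixes < 2:
        count += 1
        if rng.randint(1, 6) == 6:
            sixes += 1
    return count

rng = random.Random(42)
samples = [rolls_until_second_six(rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
print(mean, var)  # roughly 12 and 60
```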

Proposition. If a random variable $X$ with values $\{a_{1}, a_{2},\dots\}$ and probabilities $\{p_{1}, p_{2},\dots\}$ has only non-negative values ($a_{i}\geq 0$ for all $i \in \mathbb{N}$) and $\sum_{i=1}^{\infty} a_{i}p_{i}=+\infty$, then $EX=+\infty$.


A+ Corner

For those of you who want to know more, we’ll show you how to find the expected value of $X$ from the previous example without using the trick with geometric random variables.

We potentially have an infinite number of rolls, with a minimum of two. As a result, the expected value is equal to $$E[X]=\sum_{n=2}^{\infty} n\cdot P(X=n)= \sum_{n=2}^{\infty} n(n-1)\frac{5^{n-2}}{6^{n}}$$ How do we evaluate this sum? Let's rearrange the factors. We can extract $\frac{1}{6^{2}}$ and get $$\displaystyle{E[X]=\frac{1}{6^{2}}\sum_{n=2}^{\infty} n(n-1) \big(\frac{5}{6}\big)^{n-2}}$$ This now reminds us of a geometric series. As a matter of fact, it is the second derivative of a geometric series.

In general, the geometric series is $\sum_{n=0}^{\infty} x^{n}=\frac{1}{1-x}$, for $|x|<1$. Differentiating with respect to $x$ gives $$\sum_{n=0}^{\infty} nx^{n-1}=\frac{1}{(1-x)^{2}}$$ When we drop the trivial term (for $n=0$), we have $$\sum_{n=1}^{\infty} nx^{n-1}=\frac{1}{(1-x)^{2}}$$ Differentiating once more, $$\sum_{n=2}^{\infty} n(n-1)x^{n-2}=\frac{2}{(1-x)^{3}}$$ This is exactly our sum with $x=\frac{5}{6}$.

As a result, $$\displaystyle{EX=\frac{1}{6^{2}} \cdot \frac{2}{(1-\frac{5}{6})^{3}}=\frac{1}{36} \cdot 432 = 12}$$
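Truncating the series numerically gives the same value:

```python
# Partial sum of E[X] = (1/36) * sum_{n>=2} n(n-1) * (5/6)^(n-2);
# the tail beyond n = 500 is negligible
total = sum(n * (n - 1) * (5 / 6) ** (n - 2) for n in range(2, 500)) / 36
print(round(total, 6))  # 12.0
```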

There is another way to see this: as in part (b), we can view the rolls as two consecutive waits for a first six, $X=X_{1}+X_{2}$. When we get the first six, we can pretend we restarted the experiment; both $X_{1}$ and $X_{2}$ are then geometric random variables with parameter $\frac{1}{6}$.