Random Variables > Independent Random Variables
You may want to read this article first: What is a Random Variable?
What are Independent Random Variables?
An independent random variable is a random variable that doesn’t have an effect on the other random variables in your experiment. In other words, it doesn’t affect the probability of another event happening. For example, let’s say you wanted to know the average weight of a bag of sugar so you randomly sample 50 bags from various grocery stores. You wouldn’t expect the weight of one bag to affect another, so the variables are independent. The opposite is a dependent random variable, which does affect probabilities of other random variables.
Another way of looking at it:
Knowing the value of X, an independent random variable, doesn’t help us to predict a value for Y and vice versa.
A random variable is a variable associated with an experiment, like n tosses of a coin or d draws of cards. From a (more technical) standpoint, two random variables are independent if either of the following statements is true:
- P(x|y) = P(x), for all values of X and Y.
- P(x∩y) = P(x) * P(y), for all values of X and Y.
The two are equivalent whenever P(y) > 0, so that the conditional probability is defined.
The first statement, P(x|y) = P(x), for all values of X and Y, says "the probability of x, given y, is just the probability of x." In other words, knowing y makes no difference to the probability of x; it's still going to be P(x) no matter what the value of y.
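The second criterion is easy to check mechanically: compute the marginal probabilities from a joint distribution and test whether the joint factors into their product. Here's a minimal sketch in Python; the function name `is_independent` and the two example distributions are illustrative, not from any particular library.

```python
from itertools import product

def is_independent(joint, tol=1e-9):
    """Check P(x, y) == P(x) * P(y) for every (x, y) pair.

    `joint` maps (x, y) pairs to probabilities.
    """
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    # Marginals: sum the joint distribution over the other variable.
    px = {x: sum(joint.get((x, y), 0) for y in ys) for x in xs}
    py = {y: sum(joint.get((x, y), 0) for x in xs) for y in ys}
    return all(abs(joint.get((x, y), 0) - px[x] * py[y]) < tol
               for x, y in product(xs, ys))

# Independent: the joint is the product of marginals (0.5, 0.5) and (0.3, 0.7).
indep = {(0, 0): 0.15, (0, 1): 0.35, (1, 0): 0.15, (1, 1): 0.35}
# Dependent: X and Y always take the same value.
dep = {(0, 0): 0.5, (1, 1): 0.5}
```

For the dependent pair, P(X=0) = P(Y=0) = 0.5, so independence would require P(X=0, Y=0) = 0.25, but the joint probability is 0.5.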
You may recognize the second one as the multiplication rule for independent events, which states that if you have two independent events, you multiply their probabilities together. For example, let's say your chance of winning a prize in bingo is 1/1000 and your chance of finding a parking space right next to the bingo hall is 1/20. Your chance of finding a parking space next to the bingo hall and winning in bingo is 1/1000 * 1/20 = 1/20,000. This might make intuitive sense, but it's not any kind of proof.
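The bingo arithmetic above can be confirmed with exact fractions, which avoids any floating-point rounding; this is just a sketch of the multiplication in code.

```python
from fractions import Fraction

p_bingo = Fraction(1, 1000)    # chance of winning a prize in bingo
p_parking = Fraction(1, 20)    # chance of the adjacent parking space
p_both = p_bingo * p_parking   # independent events: multiply
print(p_both)  # 1/20000
```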
How to Tell if Random Variables are Independent
You can tell if two random variables are independent by comparing their individual (marginal) probabilities with their conditional probabilities. For example, let's say you have two random variables X and Y, where X can equal 0, 1, or 2 and Y can equal 0 or 1.
- The probability that X = 0 is 20%: P(X=0) = 0.2.
- The probability that X = 1 is 30%: P(X=1) = 0.3.
- The probability that X = 2 is 50%: P(X=2) = 0.5.
- The probability that Y is 0 is 40%: P(Y=0) = 0.4.
- The probability that Y is 1 is 60%: P(Y=1) = 0.6.
But what happens to the probabilities when the two happen at the same time?
For each possible combination of X and Y, the conditional probabilities are:
- P(x=0 | y=0) = 0.2;
- P(x=1 | y=0) = 0.3;
- P(x=2 | y=0) = 0.5;
- P(x=0 | y=1) = 0.2;
- P(x=1 | y=1) = 0.3;
- P(x=2 | y=1) = 0.5;
No matter which value Y takes, the conditional probabilities for X match its marginal probabilities, and vice versa. Therefore, these are independent random variables.
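The check above can be sketched in a few lines of Python: build the joint distribution from the marginals stated in the example (20%/30%/50% for X and 40%/60% for Y), then confirm that every conditional P(X=x | Y=y) = P(X=x, Y=y) / P(Y=y) equals the marginal P(X=x).

```python
# Marginal probabilities from the example above.
px = {0: 0.2, 1: 0.3, 2: 0.5}
py = {0: 0.4, 1: 0.6}

# Under independence, the joint is the product of the marginals.
joint = {(x, y): px[x] * py[y] for x in px for y in py}

# Conditional P(X=x | Y=y) = P(X=x, Y=y) / P(Y=y) should equal P(X=x).
for (x, y), p in joint.items():
    cond = p / py[y]
    assert abs(cond - px[x]) < 1e-9
```

If any conditional had differed from the matching marginal, the assertion would fail and the variables would be dependent.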