A probability tree diagram is one way to study the sample space of an experiment. Tree diagrams are often helpful for studying the outcomes and probabilities of multistage experiments, so any situation where an “experiment” happens in stages can serve as a tree diagram real life example.
Tree Diagram Real Life Example List
- Accepting a job offer: the branches would represent each benefit (like increased salary) or downside (like a longer commute).
- Turning left or right at a series of intersections (useful for traffic planning).
- Drawing certain cards in succession (handy if you play cards).
- Sports probabilities, like a basketball player making x baskets out of n independent free throws (see the sketch after this list).
- Assessing the probability of a volcanic eruption; the U.S. government used a probability tree to predict an eruption on Mount Pinatubo.
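For the free throw item, here is a minimal sketch (in Python; the helper name free_throw_prob and the 75% shooting percentage are assumptions for illustration) of how the leaf probabilities of an n-stage make/miss tree add up to the familiar binomial formula:

```python
from math import comb

def free_throw_prob(n, k, p):
    """Probability of making exactly k baskets in n independent free throws,
    each made with probability p. This is the binomial formula, i.e. the sum
    of the probabilities of the leaves of an n-stage make/miss tree that
    contain exactly k makes."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Example: a 75% shooter takes 3 free throws
for k in range(4):
    print(f"P({k} baskets) = {free_throw_prob(3, k, 0.75):.4f}")
```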
Decision trees are also the foundation for a machine learning method: decision tree learning, which uses a tree-structured predictive model. When the target variable takes on a discrete set of values, the model is a classification tree; when the target variable is continuous, it is a regression tree [1].
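As a rough illustration, the sketch below fits both kinds of tree with scikit-learn; the library, dataset, and parameter choices are assumptions for illustration only (the cited tutorial [1] works in R rather than Python):

```python
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, DecisionTreeRegressor

X, y = load_iris(return_X_y=True)

# Classification tree: the target (iris species) takes discrete values 0, 1, 2.
clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X, y)
print("Predicted species:", clf.predict(X[:1]))

# Regression tree: the target (petal length, in cm) is continuous.
reg = DecisionTreeRegressor(max_depth=3, random_state=0)
reg.fit(X[:, [0, 1, 3]], X[:, 2])
print("Predicted petal length:", reg.predict(X[:1, [0, 1, 3]]))
```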
Tree Diagram Real Life Example: Appearing on a Game Show
Let’s say you’re appearing on a game show. You are asked to choose one of three doors. Behind two of the doors there is a goat; behind the other door there is a car. After you’ve made your choice, the host opens one of the remaining doors to reveal a goat and asks if you want to switch to the other unopened door. Should you switch or stay? This particular problem is called the Monty Hall problem, and it has confounded contestants and mathematicians for decades. One way to approach the problem is to calculate the probabilities with a probability tree.
The root (at the far left) represents the initial state, before the car (or goats) have been placed. The next three vertices (moving to the right) represent the state of the experiment after the prizes have been placed, but before you pick a door. Each “leaf” at the far right is labeled with a W or L, indicating whether you would win or lose with a particular strategy. Note that this tree doesn’t yet label the probability of each vertex; you would want to take that extra step to see which outcomes are more likely, as in the sketch below.
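Here is a minimal sketch of that extra step (in Python; the helper name monty_hall_tree is hypothetical). It enumerates the leaves of the tree, multiplying the 1/3 probabilities along each path. The host’s choice of which goat door to open is folded in, because it does not change whether the “stay” or “switch” strategy wins at a given leaf:

```python
from fractions import Fraction

def monty_hall_tree():
    """Walk the probability tree: branch on where the car is placed (1/3 each),
    then on the contestant's first pick (1/3 each), and total the leaf
    probabilities that are wins for each strategy."""
    stay_win = Fraction(0)
    switch_win = Fraction(0)
    for car in range(3):                 # stage 1: where the car is placed
        for pick in range(3):            # stage 2: the contestant's first pick
            leaf_prob = Fraction(1, 3) * Fraction(1, 3)
            if pick == car:
                stay_win += leaf_prob    # staying wins only if the first pick was right
            else:
                switch_win += leaf_prob  # switching wins whenever the first pick was wrong
    return stay_win, switch_win

stay, switch = monty_hall_tree()
print(f"P(win | stay)   = {stay}")       # 1/3
print(f"P(win | switch) = {switch}")     # 2/3
```

The totals show why switching is the better strategy: staying wins with probability 1/3, while switching wins with probability 2/3.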
References
[1] Yanh, T. Supervised Learning for Machine Learning using R – 2. Decision Tree. Retrieved November 1, 2021 from: https://tiantiy.people.clemson.edu/blog/2019/Machine%20Learning/R2_ML_SupervisedLearning_DecisionTree.html
[2] Nagpal, R. (2002). Introduction to Probability. Retrieved November 1, 2021 from: http://cs.brown.edu/courses/cs151/reference/probability.pdf