Probability is a powerful tool for making informed decisions, especially when not everything is known. But what happens when you’re given partial information — like a positive test result, or getting a quiz question correct? Conditional probability helps us update our beliefs in such situations.
In this article, we’ll explore:
- What conditional probability really means
- Key rules: Chain Rule, Law of Total Probability, and Bayes’ Theorem
- 5 fully worked-out real-world examples
🧠 What is Conditional Probability?
Conditional probability is the probability of an event occurring given that another event has already occurred.
P(A | B) = P(A ∩ B) / P(B)
This means: out of the scenarios where B happens, what fraction of them also include A?
Example:
What is the chance that it’s an Ace given the card is red?
🔗 Chain Rule of Probability
The chain rule helps us build up joint probabilities using conditional ones:
P(A ∩ B) = P(A | B) × P(B)
P(A ∩ B ∩ C) = P(C | A ∩ B) × P(B | A) × P(A)
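As a concrete check, the two-event chain rule computes the chance of drawing two aces in a row from a standard deck without replacement. Exact fractions avoid any rounding:

```python
from fractions import Fraction

# Drawing two cards without replacement: what is P(both are aces)?
p_first_ace = Fraction(4, 52)               # P(A): first card is an ace
p_second_ace_given_first = Fraction(3, 51)  # P(B | A): 3 aces left among 51 cards

# Chain rule: P(A ∩ B) = P(B | A) × P(A)
p_both_aces = p_second_ace_given_first * p_first_ace
print(p_both_aces)  # 1/221
```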
📊 Law of Total Probability
Used when an outcome can arise from several mutually exclusive, exhaustive causes (a partition of the sample space):
P(B) = P(B | A₁) × P(A₁) + P(B | A₂) × P(A₂) + ...
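A minimal numeric sketch of this sum, using two made-up causes A₁ and A₂ whose probabilities add to 1:

```python
# Illustrative (made-up) numbers for two mutually exclusive, exhaustive causes.
p_a1, p_a2 = 0.4, 0.6                  # P(A1) + P(A2) = 1
p_b_given_a1, p_b_given_a2 = 0.7, 0.2  # P(B | A1), P(B | A2)

# Law of Total Probability: weight each conditional by its cause's probability.
p_b = p_b_given_a1 * p_a1 + p_b_given_a2 * p_a2
print(round(p_b, 2))  # 0.4
```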
🔁 Bayes’ Theorem
Used to reverse conditional probabilities:
P(A | B) = [P(B | A) × P(A)] / P(B)
It shines in diagnosis, decision-making, and machine learning.
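The theorem is mechanical enough to wrap in a small helper. This sketch (with made-up numbers) folds the Law of Total Probability into the denominator, which is the pattern every worked example below follows:

```python
def bayes(p_b_given_a, p_a, p_b_given_not_a):
    """Return P(A | B), reversing the conditional via Bayes' theorem.

    The denominator P(B) is expanded with the Law of Total Probability:
    P(B) = P(B | A) P(A) + P(B | ¬A) P(¬A).
    """
    p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
    return p_b_given_a * p_a / p_b

# Illustrative numbers: P(A) = 0.5, P(B | A) = 0.8, P(B | ¬A) = 0.4
print(round(bayes(0.8, 0.5, 0.4), 4))  # 0.6667
```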
💡 Worked Example 1: Multiple-Choice Question
A student answers a multiple-choice question.
- Knows the concept: P(K) = 3/4
- Guesses correctly when they don't know it: P(C | ¬K) = 1/4
- Gets it right when they do know it: P(C | K) = 9/10
What is P(K | C) — the chance they knew it given they got it correct?
Step 1: Find P(C) with the Law of Total Probability
P(C) = P(C | K) × P(K) + P(C | ¬K) × P(¬K)
= (9/10)(3/4) + (1/4)(1/4)
= 27/40 + 1/16
= 54/80 + 5/80 = 59/80
Step 2: Apply Bayes' Theorem
P(K | C) = (9/10 × 3/4) / (59/80)
= (27/40) ÷ (59/80)
= (27 × 80) / (40 × 59)
= 54/59 ≈ 0.915
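The fraction arithmetic here is easy to slip on, so it's worth checking with exact fractions:

```python
from fractions import Fraction

p_k = Fraction(3, 4)              # P(K): student knows the concept
p_c_given_k = Fraction(9, 10)     # P(C | K)
p_c_given_not_k = Fraction(1, 4)  # P(C | ¬K): pure guessing

# Law of Total Probability, then Bayes' theorem.
p_c = p_c_given_k * p_k + p_c_given_not_k * (1 - p_k)
p_k_given_c = p_c_given_k * p_k / p_c
print(p_c, p_k_given_c)  # 59/80 54/59
```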
🧪 Example 2: Medical Test Accuracy
A disease affects 1 in 1000. Test sensitivity = 99%, specificity = 98%. You test positive.
P(D) = 0.001, P(¬D) = 0.999
P(T⁺ | D) = 0.99, P(T⁺ | ¬D) = 1 − 0.98 = 0.02
P(T⁺) = 0.99 × 0.001 + 0.02 × 0.999 = 0.00099 + 0.01998 = 0.02097
P(D | T⁺) = 0.00099 / 0.02097 ≈ 0.0472
Despite the positive result, the chance of actually having the disease is under 5%, because the low base rate dominates.
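The same calculation in code, with the specificity converted to a false-positive rate up front:

```python
p_d = 0.001               # prevalence: 1 in 1000
p_pos_given_d = 0.99      # sensitivity
p_pos_given_not_d = 0.02  # 1 - specificity (false-positive rate)

p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(round(p_pos, 5), round(p_d_given_pos, 4))  # 0.02097 0.0472
```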
☔ Example 3: Rain and Umbrella
Your friend shows up carrying an umbrella. What's the chance it's raining?
P(R) = 0.3, P(U | R) = 0.9, P(U | ¬R) = 0.2
P(U) = 0.9 × 0.3 + 0.2 × 0.7 = 0.27 + 0.14 = 0.41
P(R | U) = 0.27 / 0.41 ≈ 0.6585
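Notice that this follows the identical Bayes pattern as the medical test, only with a much higher prior:

```python
p_rain = 0.3            # prior: P(R)
p_umb_given_rain = 0.9  # P(U | R)
p_umb_given_dry = 0.2   # P(U | ¬R)

p_umb = p_umb_given_rain * p_rain + p_umb_given_dry * (1 - p_rain)
p_rain_given_umb = p_umb_given_rain * p_rain / p_umb
print(round(p_umb, 2), round(p_rain_given_umb, 4))  # 0.41 0.6585
```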
🃏 Example 4: Cards
Probability of Ace given the card is red?
P(Ace ∩ Red) = 2/52, P(Red) = 26/52
P(Ace | Red) = (2/52) / (26/52) = 2/26 = 1/13
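Because the sample space is small, this one can be verified by brute-force enumeration of the whole deck, counting red cards and red aces directly:

```python
# Enumerate a standard 52-card deck as (rank, suit) pairs.
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]  # first two are red
deck = [(r, s) for r in ranks for s in suits]

red = [c for c in deck if c[1] in ("hearts", "diamonds")]
red_aces = [c for c in red if c[0] == "A"]

# Conditioning on "red" just shrinks the sample space to the 26 red cards.
p_ace_given_red = len(red_aces) / len(red)
print(len(red_aces), len(red), round(p_ace_given_red, 4))  # 2 26 0.0769 (= 1/13)
```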
🏭 Example 5: Faulty Factory Machine
- Machine A: 30%, defect = 2%
- Machine B: 50%, defect = 1%
- Machine C: 20%, defect = 3%
Find P(C | Defect)
P(D) = 0.30 × 0.02 + 0.50 × 0.01 + 0.20 × 0.03 = 0.006 + 0.005 + 0.006 = 0.017
P(C | D) = 0.006 / 0.017 ≈ 0.3529
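With three causes instead of two, the total-probability sum is easiest to keep straight in a small table of (share, defect rate) pairs:

```python
machines = {           # machine: (share of output, defect rate)
    "A": (0.30, 0.02),
    "B": (0.50, 0.01),
    "C": (0.20, 0.03),
}

# Law of Total Probability over the three machines, then Bayes for machine C.
p_defect = sum(share * rate for share, rate in machines.values())
share_c, rate_c = machines["C"]
p_c_given_defect = share_c * rate_c / p_defect
print(round(p_defect, 3), round(p_c_given_defect, 4))  # 0.017 0.3529
```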
🧠 Final Thoughts
Conditional probability helps us make better decisions with limited info.
- Chain Rule = builds from conditional steps
- Law of Total Probability = combines causes
- Bayes’ Theorem = reverses conditionals
Whether you’re a student, analyst, or clinician — these tools are essential to your decision-making toolkit.





Drop your thoughts in the comments below!