Conditional events
This chapter delves into the core concepts of conditional probability, the multiplication rule, and the crucial distinction between independent and dependent events. A comprehensive understanding of these principles is essential for accurately modeling real-world probabilistic scenarios and is consistently a key area evaluated in CMI examinations.
---
Chapter Contents
| # | Topic |
|---|-------|
| 1 | Conditional probability definition |
| 2 | Multiplication rule |
| 3 | Independence |
| 4 | Dependent event reasoning |

---

We begin with Conditional probability definition.
Part 1: Conditional probability definition
Conditional Probability Definition
Overview
Conditional probability measures the probability of an event after new information is known. This is one of the most important ideas in probability because many real problems do not ask for the chance of an event in isolation; they ask for the chance of an event given that something else has already happened. In exam problems, the main difficulty is choosing the correct reduced sample space.

---

Learning Objectives
After studying this topic, you will be able to:
- State the definition of conditional probability.
- Interpret the condition as a restriction of the sample space.
- Compute $P(A \mid B)$ from given probabilities.
- Use the multiplication relation $P(A \cap B) = P(B)\,P(A \mid B)$.
- Distinguish conditional probability from ordinary probability.
Definition
If $P(B) > 0$, then the conditional probability of $A$ given $B$ is

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}.$$

- we now look only inside the event $B$
- among those outcomes, we ask what fraction also lie in $A$
Interpretation
The event $B$ becomes the new universe of possible outcomes.
So conditional probability is not about changing the event $A$; it is about changing the sample space to $B$.
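The reduced-sample-space reading can be checked by direct enumeration. Below is a minimal Python sketch on a fair die; the events "even" and "greater than 3" are our own illustrative choices, not taken from the text:

```python
from fractions import Fraction

def cond_prob(target, condition, sample_space):
    """P(target | condition) on a finite, equally likely sample space."""
    reduced = [w for w in sample_space if condition(w)]  # new universe: the condition
    overlap = [w for w in reduced if target(w)]          # outcomes also in the target
    return Fraction(len(overlap), len(reduced))

die = range(1, 7)
# P(even | greater than 3): reduced space {4, 5, 6}, evens inside it {4, 6}
p = cond_prob(lambda w: w % 2 == 0, lambda w: w > 3, die)
print(p)  # 2/3
```

Note that the function never touches outcomes outside the condition: the denominator is the size of the reduced space, exactly as the definition prescribes.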
Immediate Consequences
From the definition,

$$P(A \cap B) = P(B)\,P(A \mid B).$$

Also,

$$P(A \cap B) = P(A)\,P(B \mid A) \quad \text{when } P(A) > 0.$$
Conditional Probability and Independence
Events $A$ and $B$ are independent if $P(A \mid B) = P(A)$ (when $P(B) > 0$), which is equivalent to $P(A \cap B) = P(A)\,P(B)$.
Minimal Worked Examples
Example 1 A card is chosen from a standard deck. Let:
- $A$ = the card is a king
- $B$ = the card is a face card

A standard deck has $12$ face cards, $4$ of which are kings, so $P(A \mid B) = \frac{4}{12} = \frac{1}{3}$.

Example 2 A fair die is rolled. Let:
- $A$ = outcome is even
- $B$ = outcome is greater than
Common Patterns
- condition given explicitly: find $P(A \mid B)$
- reverse computation of $P(A \cap B)$ from $P(B)$ and $P(A \mid B)$
- card, dice, and subset problems with reduced sample spaces
- checking independence by testing whether $P(A \mid B) = P(A)$
Common Mistakes
- ❌ using $P(B \mid A)$ instead of $P(A \mid B)$
- ❌ forgetting the condition changes the sample space
- ❌ applying the formula when $P(B) = 0$
- ❌ mixing $P(A \cap B)$ with $P(A \mid B)$
CMI Strategy
- Identify the conditioning event first.
- Restrict attention to that event.
- Count or compute the overlap with the target event.
- Use $P(A \mid B) = \dfrac{P(A \cap B)}{P(B)}$.
- If the sample space is small, list outcomes directly.
- If given probabilities are numerical, use the formula directly.
Practice Questions
:::question type="MCQ" question="If $P(B) > 0$, then $P(A \mid B)$ is equal to" options=["","","",""] answer="B" hint="Recall the definition exactly." solution="By definition, $P(A \mid B) = \frac{P(A \cap B)}{P(B)}$, provided $P(B) > 0$. Hence the correct option is B."
:::
:::question type="NAT" question="A fair die is rolled. Let $A$ be the event 'even' and $B$ be the event 'greater than'. Find $P(A \mid B)$." answer="1/2" hint="Restrict the sample space to $B$." solution="Restrict the sample space to $B$. Within this reduced sample space, count the even outcomes and divide by the size of $B$. Hence the answer is $\frac{1}{2}$."
:::
:::question type="MSQ" question="Which of the following statements are true?" options=["$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$ when $P(B) > 0$","Conditional probability uses a reduced sample space","If $A$ and $B$ are independent, then $P(A \mid B) = P(A)$","In general $P(A \mid B) = P(B \mid A)$"] answer="A,B,C" hint="Recall definition and independence." solution="1. True: this is the definition. 2. True: the condition restricts the sample space to $B$. 3. True: independence means the condition does not change the probability. 4. False: in general $P(A \mid B) \neq P(B \mid A)$. Hence A, B, and C are correct."
:::

Summary
- Conditional probability means probability inside a reduced sample space.
- The definition is $P(A \mid B) = \dfrac{P(A \cap B)}{P(B)}$ for $P(B) > 0$.
- The multiplication rule comes directly from the definition.
- Independence means the condition does not change the probability.
- Clear identification of the conditioning event is the whole game.
---
Proceeding to Multiplication rule.
---
Part 2: Multiplication rule
Multiplication Rule
Overview
The multiplication rule is the main tool for finding the probability that several events happen together. It becomes especially important in sequential experiments, where the probability of a full path is built step by step. In exam problems, the rule often appears in card drawing, ball drawing, repeated trials, and multi-stage decisions.

---

Learning Objectives
After studying this topic, you will be able to:
- Use the multiplication rule for two events.
- Extend it to three or more events.
- Handle both independent and dependent sequential processes.
- Compute probabilities of ordered event sequences.
- Recognize when multiplication must be combined with case-splitting or addition.
Core Rule
For events $A$ and $B$ with $P(A) > 0$,

$$P(A \cap B) = P(A)\,P(B \mid A).$$

- $P(A)$: the first event happens,
- $P(B \mid A)$: then the second event happens given the first.

Also,

$$P(A \cap B) = P(B)\,P(A \mid B) \quad \text{when } P(B) > 0.$$
Chain Rule
For events $A_1, A_2, \ldots, A_n$,

$$P(A_1 \cap A_2 \cap \cdots \cap A_n) = P(A_1)\,P(A_2 \mid A_1)\,P(A_3 \mid A_1 \cap A_2)\cdots P(A_n \mid A_1 \cap \cdots \cap A_{n-1}).$$
Independent Case
If $A$ and $B$ are independent, then

$$P(B \mid A) = P(B),$$

so the multiplication rule becomes

$$P(A \cap B) = P(A)\,P(B).$$
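The step-by-step rule can be sanity-checked against exact enumeration. A small Python sketch using a hypothetical bag of 3 red and 2 green balls (the counts are our own choice, not from the text), drawn without replacement:

```python
from fractions import Fraction
from itertools import permutations

# Hypothetical bag: 3 red and 2 green balls, two draws without replacement.
bag = ['R', 'R', 'R', 'G', 'G']

# Multiplication rule along the path:
# P(first red) * P(second green | first red)
step_by_step = Fraction(3, 5) * Fraction(2, 4)

# Exact check: enumerate all ordered pairs of distinct positions in the bag.
pairs = list(permutations(range(len(bag)), 2))
favourable = [(i, j) for i, j in pairs if bag[i] == 'R' and bag[j] == 'G']
exact = Fraction(len(favourable), len(pairs))

print(step_by_step, exact)  # 3/10 3/10
```

The second factor is $\frac{2}{4}$, not $\frac{2}{5}$: the conditional probability is taken in the reduced bag of 4 balls that remains after the first draw.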
Ordered vs Unordered
The multiplication rule naturally computes ordered sequences.
For example:
- first red then blue,
- head then tail,
- first chosen is a boy and second chosen is a girl.
If the event can happen in several orders, then you often need to add the probabilities of those paths.
Minimal Worked Examples
Example 1 A bag contains red, green, and blue balls. Two balls are drawn without replacement. The probability that the first is red and the second is green is found by multiplying the first-draw probability by the conditional probability of the second draw.

---

Example 2 A fair coin is tossed three times. The probability of getting HHT is $\frac{1}{2} \cdot \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{8}$ because the tosses are independent.

---

Multiplication + Addition Together
If an event can happen in several disjoint ways, then:
- use multiplication rule for each ordered path,
- add the path probabilities.
Example:
To get exactly one black ball in two draws, you may have:
- black then non-black
- non-black then black
So the total probability is the sum of the two path probabilities.
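The path-splitting recipe above can be sketched in Python. The urn composition (3 black, 4 non-black) is our own illustrative choice, not taken from the text:

```python
from fractions import Fraction

# Hypothetical urn: 3 black and 4 non-black balls,
# two draws without replacement; we want exactly one black ball.
black, other = 3, 4
total = black + other

# Path 1: black first, then non-black (conditional second factor).
p_black_then_other = Fraction(black, total) * Fraction(other, total - 1)
# Path 2: non-black first, then black.
p_other_then_black = Fraction(other, total) * Fraction(black, total - 1)

# The two ordered paths are disjoint, so their probabilities add.
p_exactly_one = p_black_then_other + p_other_then_black
print(p_exactly_one)  # 4/7
```

Each path is a product of a first-step probability and a conditional second-step probability; the final answer is the sum over the disjoint paths.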
Common Mistakes
- ❌ Multiplying unconditional probabilities in a dependent setting.
- ❌ Forgetting to condition later steps on earlier steps.
- ❌ Using multiplication when the paths should first be split into cases.
- ❌ Treating unordered events as a single ordered path.
CMI Strategy
- Write the event in a clear time order.
- Compute each step using the correct current probability.
- Multiply along the path.
- If several valid paths exist, add them at the end.
- Check whether the process is with replacement or without replacement.
Practice Questions
:::question type="MCQ" question="A bag contains red, green, and blue balls. Two balls are drawn without replacement. The probability that the first is red and the second is green is" options=["","","",""] answer="B" hint="Multiply the probability of the first step by the conditional probability of the second." solution="Multiply the probability that the first ball is red by the conditional probability that the second is green, computed in the reduced bag left after the first draw. Hence the correct option is B."
:::
:::question type="NAT" question="A fair coin is tossed four times. Find the probability of the ordered outcome HTTH." answer="1/16" hint="Use independence and multiply four factors of $\frac{1}{2}$." solution="Each toss has probability $\frac{1}{2}$, and the four tosses are independent. So $P(\text{HTTH}) = \left(\frac{1}{2}\right)^4 = \frac{1}{16}$. Hence the answer is $\frac{1}{16}$."
:::
:::question type="MSQ" question="Which of the following statements are true?" options=["For events $A, B$ with $P(A) > 0$, $P(A \cap B) = P(A)\,P(B \mid A)$","For independent events $A, B$, $P(A \cap B) = P(A)\,P(B)$","The multiplication rule is naturally suited to sequential experiments","For dependent events, one may always replace $P(B \mid A)$ by $P(B)$"] answer="A,B,C" hint="One statement ignores dependence." solution="1. True: this is the multiplication rule. 2. True: independence removes the conditioning. 3. True: each step is conditioned on the steps before it. 4. False: for dependent events $P(B \mid A) \neq P(B)$ in general. Hence A, B, and C are correct."
:::

Summary
- The multiplication rule computes the probability of several events happening together.
- In dependent settings, later probabilities must be conditional.
- For independent events, the rule simplifies to plain multiplication.
- Ordered paths are multiplied; multiple disjoint paths are then added.
- The multiplication rule is one of the most important tools in sequential probability.
---
Proceeding to Independence.
---
Part 3: Independence
Independence
Overview
Independence is one of the most important ideas in probability, but also one of the most misunderstood. Two events are independent if knowing that one happened does not change the probability of the other. In CMI-style questions, the difficulty often comes from hidden conditioning, multi-stage experiments, and situations where events look symmetric but are actually dependent because of a common first step. This topic is especially important for conditional probability, because many problems ask whether events are independent unconditionally, conditionally, or neither.

---

Learning Objectives
After studying this topic, you will be able to:
- test whether two events are independent,
- compute conditional probabilities involving independent events,
- distinguish unconditional independence from conditional independence,
- handle multi-stage experiments with a hidden first random choice,
- avoid common mistakes such as confusing disjointness with independence.
Core Definition
Two events $A$ and $B$ are independent if

$$P(A \cap B) = P(A)\,P(B),$$

provided all probabilities are taken in the same experiment.

If $P(B) > 0$, then independence of $A$ and $B$ is equivalent to

$$P(A \mid B) = P(A).$$

Similarly, if $P(A) > 0$, it is also equivalent to

$$P(B \mid A) = P(B).$$
How to Test Independence
To check whether and are independent:
- compute $P(A)$,
- compute $P(B)$,
- compute $P(A \cap B)$,
- compare $P(A \cap B)$ with $P(A)\,P(B)$.
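On a finite, equally likely sample space this test is mechanical. A Python sketch; the die events used to exercise it are our own illustrative choices:

```python
from fractions import Fraction

def prob(event, space):
    """Probability of an event on a finite, equally likely sample space."""
    return Fraction(sum(1 for w in space if event(w)), len(space))

def independent(a, b, space):
    """True iff P(A and B) == P(A) * P(B)."""
    p_both = prob(lambda w: a(w) and b(w), space)
    return p_both == prob(a, space) * prob(b, space)

die = range(1, 7)
even = lambda w: w % 2 == 0

print(independent(even, lambda w: w <= 4, die))  # True:  1/3 == 1/2 * 2/3
print(independent(even, lambda w: w > 3, die))   # False: 1/3 != 1/2 * 1/2
```

The contrast is the point: "even" is independent of "at most 4" on a die but dependent on "greater than 3", and only the exact computation reveals which is which.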
Independence Is Not Disjointness
Disjoint events satisfy $A \cap B = \varnothing$, so

$$P(A \cap B) = 0.$$

If $A$ and $B$ are disjoint and both have positive probability, then $P(A)\,P(B) > 0$ while $P(A \cap B) = 0$, so they cannot be independent. Intuitively:
- disjointness means the events cannot occur together,
- independence means occurrence of one gives no information about the other.
Independence and Complements
If and are independent, then the following pairs are also independent:
- $A$ and $B^c$
- $A^c$ and $B$
- $A^c$ and $B^c$
Multi-Stage Experiments
In many problems, two visible outcomes are generated after a hidden first random choice.
In such cases:
- the visible outcomes may be independent given the first stage,
- but not independent overall.
This is one of the most common exam traps.
Conditional Independence
Events $A$ and $B$ are conditionally independent given an event (or random variable) $C$ if

$$P(A \cap B \mid C) = P(A \mid C)\,P(B \mid C).$$

This does not automatically imply that $A$ and $B$ are independent without conditioning.
Minimal Worked Example: Hidden Common Cause
Example 1 A fair coin is tossed.
- If heads occurs, both daughters win.
- If tails occurs, each daughter independently wins with probability .
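The example above can be worked numerically. In the sketch below we take the tails-case win probability to be $\frac{1}{2}$ (an assumed value for illustration; the original number is not in the text). The two wins come out conditionally independent given the coin, but dependent overall:

```python
from fractions import Fraction

# Hidden stage: a fair coin. Heads forces both wins; on tails each
# daughter wins independently with probability 1/2 (our assumption).
half = Fraction(1, 2)
outcomes = [(('H', 1, 1), half)]                       # heads: both win
outcomes += [(('T', a, b), half * half * half)         # 1/2 * 1/2 * 1/2 each
             for a in (0, 1) for b in (0, 1)]

def p(pred):
    """Total probability weight of outcomes satisfying pred."""
    return sum(w for o, w in outcomes if pred(o))

pA = p(lambda o: o[1] == 1)                 # daughter 1 wins
pB = p(lambda o: o[2] == 1)                 # daughter 2 wins
pAB = p(lambda o: o[1] == 1 and o[2] == 1)  # both win

pT = p(lambda o: o[0] == 'T')
pA_T = p(lambda o: o[0] == 'T' and o[1] == 1) / pT
pB_T = p(lambda o: o[0] == 'T' and o[2] == 1) / pT
pAB_T = p(lambda o: o[0] == 'T' and o[1] == 1 and o[2] == 1) / pT

print(pAB_T == pA_T * pB_T)  # True: independent given the coin
print(pAB == pA * pB)        # False: the hidden coin makes them dependent
```

Under these assumed numbers, $P(A \cap B) = \frac{5}{8}$ exceeds $P(A)\,P(B) = \frac{9}{16}$: seeing one daughter win makes heads more likely, which in turn makes the other win more likely.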
Standard Probability Identities
For any events $A$ and $B$ with $P(B) > 0$:

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}.$$

Also,

$$P(A^c \mid B) = 1 - P(A \mid B).$$

If $A$ and $B$ are independent, then

$$P(A \mid B) = P(A) \quad \text{and} \quad P(B \mid A) = P(B).$$
Pairwise vs Mutual Independence
For three events $A$, $B$, $C$:
- pairwise independence means each pair is independent,
- mutual independence means
  - each pair is independent, and
  - $P(A \cap B \cap C) = P(A)\,P(B)\,P(C)$.
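The gap between the two notions is real. A standard counterexample (our own illustration, not from the text) uses two fair coin tosses with a parity event:

```python
from fractions import Fraction
from itertools import product

space = list(product('HT', repeat=2))  # four equally likely outcomes

def prob(event):
    return Fraction(sum(1 for w in space if event(w)), len(space))

A = lambda w: w[0] == 'H'   # first toss is heads
B = lambda w: w[1] == 'H'   # second toss is heads
C = lambda w: w[0] == w[1]  # the two tosses match

# Every pair multiplies correctly: A, B, C are pairwise independent.
pairwise = all(
    prob(lambda w, e=e, f=f: e(w) and f(w)) == prob(e) * prob(f)
    for e, f in [(A, B), (A, C), (B, C)]
)
# But the triple intersection {HH} has probability 1/4, not 1/8.
triple = prob(lambda w: A(w) and B(w) and C(w))

print(pairwise)                               # True
print(triple == prob(A) * prob(B) * prob(C))  # False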
Common Patterns
- two independent tosses or throws,
- repeated Bernoulli trials,
- hidden first-stage random choice causing dependence,
- checking independence by direct calculation,
- conditional-probability questions asking whether one event changes another.
Common Mistakes
- ❌ assuming symmetry implies independence,
- ❌ confusing disjoint events with independent events,
- ❌ treating conditional independence as ordinary independence,
- ❌ checking only $P(A \mid B) = P(A)$ without ensuring $P(B) > 0$.
CMI Strategy
- Define the events clearly.
- If the experiment has stages, split according to the first stage.
- Compute marginals and intersections separately.
- Check independence only after exact calculation.
- In conditional questions, write everything in terms of intersection over conditioning event.
Practice Questions
:::question type="MCQ" question="If $A$ and $B$ are independent with $P(A) = $ and $P(B) = $, then $P(A \cap B)$ equals" options=["","","",""] answer="A" hint="Use the product rule for independent events." solution="Since $A$ and $B$ are independent, $P(A \cap B) = P(A)\,P(B)$. Hence the correct option is A."
:::
:::question type="NAT" question="Suppose $A$ and $B$ are independent, , and . Find ." answer="0.3" hint="Use complement independence or subtract from ." solution="Because $A$ and $B$ are independent, so are their complements, and the required probability follows from the product of the relevant probabilities. Hence the answer is $0.3$."
:::
:::question type="MSQ" question="Which of the following statements are true?" options=["If $A$ and $B$ are independent, then $P(A \cap B) = P(A)\,P(B)$","If $A$ and $B$ are disjoint and both have positive probability, then they are independent","If $A$ and $B$ are independent, then $A$ and $B^c$ are independent","Conditional independence always implies unconditional independence"] answer="A,C" hint="Independence, disjointness, and conditional independence are different concepts." solution="1. True. This is the definition of independence. 2. False: disjoint events with positive probability satisfy $P(A \cap B) = 0 \neq P(A)\,P(B)$. 3. True: independence is preserved under complements. 4. False: a hidden common cause can make conditionally independent events dependent overall. Hence A and C are correct."
:::

Summary
- Independence means $P(A \cap B) = P(A)\,P(B)$.
- The equivalent conditional forms $P(A \mid B) = P(A)$ and $P(B \mid A) = P(B)$ are valid when the conditioning probability is positive.
- Disjointness and independence are very different.
- Hidden first-stage randomness can create dependence.
- Conditional independence does not imply unconditional independence.
- Always compute, do not guess.
---
Proceeding to Dependent event reasoning.
---
Part 4: Dependent event reasoning
Dependent Event Reasoning
Overview
Events are called dependent when the occurrence of one event changes the probability of another. This is one of the most important probability ideas in exam problems involving cards, balls, arrangements, and sequential choices without replacement. In CMI-style questions, the real test is whether you correctly update the sample space after earlier information is known.

---

Learning Objectives
After studying this topic, you will be able to:
- Recognize when two events are dependent.
- Update probabilities after new information is revealed.
- Work correctly with “without replacement” situations.
- Use conditional probability language in dependent settings.
- Avoid treating dependent events as independent.
Core Idea
Two events $A$ and $B$ are dependent if knowing that $A$ has happened changes the probability of $B$, that is, if $P(B \mid A) \neq P(B)$.
Dependence appears when
- an object has been removed,
- information has been revealed,
- one choice affects the remaining possibilities.
Standard Sequential View
When objects are chosen without replacement, later probabilities must be computed from the reduced set.
Example:
If a bag has red and blue balls, then after drawing one red ball, the composition becomes
Conditional Probability Language
The expression

$$P(A \mid B)$$

means:
the probability of event $A$ given that event $B$ has already occurred.
Typical Dependent Situations
Dependent reasoning appears in:
- drawing cards without replacement
- drawing balls from an urn without replacement
- arranging objects step by step
- revealing partial information
- “given that at least one” type conditions
Updating the Sample Space
After an event occurs, do not continue using the old denominator automatically.
Always ask:
- what outcomes are still possible now?
- how many favourable outcomes remain now?
Minimal Worked Examples
Example 1 A bag has white and black balls. One ball is drawn and not replaced. Given that the first ball was black, the probability that the second is black is computed from the reduced bag: only the remaining black balls count, out of the reduced total.

---

Example 2 A bag has red and blue balls. Given that the first ball drawn is red, the probability that the second ball is blue is computed from the updated composition; it is not the original blue probability.

---

“At Least One” Conditioning
Statements like
- “given that at least one selected ball is black”
- “given that at least one toss is a head”
change the sample space.
You must condition on the reduced event, not on the original total sample space.
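The effect of such conditioning is easy to see by enumeration. A Python sketch using our own illustrative example, two fair coin tosses conditioned on "at least one head":

```python
from fractions import Fraction
from itertools import product

space = list(product('HT', repeat=2))      # HH, HT, TH, TT

# Conditioning on "at least one head" removes TT from the sample space.
reduced = [w for w in space if 'H' in w]   # HH, HT, TH
both_heads = [w for w in reduced if w == ('H', 'H')]

p = Fraction(len(both_heads), len(reduced))
print(p)  # 1/3
```

The denominator is 3, not 4: using the original sample space would give the wrong answer $\frac{1}{4}$, which is exactly the mistake of conditioning on the unreduced space.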
Common Mistakes
- ❌ Treating “without replacement” as if probabilities stay unchanged.
- ❌ Multiplying unconditional probabilities in a dependent setting.
- ❌ Ignoring the information in “given that” statements.
- ❌ Forgetting to update the denominator after one event occurs.
- ❌ Using symmetry where the events are not symmetric after conditioning.
CMI Strategy
- Identify what information is already known.
- Rewrite the state of the system after that information.
- Count the remaining favourable outcomes.
- Count the remaining total outcomes.
- Only then compute the probability.
Practice Questions
:::question type="MCQ" question="A bag contains red and blue balls. Two balls are drawn without replacement. Given that the first ball is red, the probability that the second is blue is" options=["","","",""] answer="B" hint="Update the composition after the first red ball is removed." solution="After one red ball is removed, count the blue balls among the remaining balls; that fraction is the required conditional probability. Hence the correct option is B."
:::
:::question type="NAT" question="A bag contains white and black balls. Two balls are drawn without replacement. Given that the first ball drawn is black, find the probability that the second ball is also black." answer="3/8" hint="Condition on the updated bag after one black ball is removed." solution="After one black ball is drawn, count the black balls left and divide by the total number of balls left. So the answer is $\frac{3}{8}$."
:::
:::question type="MSQ" question="Which of the following situations involve dependent events?" options=["Drawing two cards from a deck without replacement","Drawing one ball, replacing it, and then drawing another","Choosing two students one after another without replacement","Tossing a fair coin twice"] answer="A,C" hint="Check whether the first outcome changes the second probability." solution="1. Dependent, because the first card changes the deck. 2. Independent: replacement restores the original composition. 3. Dependent: the first choice removes a student from the pool. 4. Independent: the tosses do not affect each other. Hence A and C are correct."
:::

Summary
- Dependent events require updating probabilities after information is known.
- Without replacement is the standard source of dependence.
- The phrase “given that” must change the sample space.
- Conditional reasoning is usually the cleanest language for dependent events.
- The hardest part is not the arithmetic — it is identifying the correct updated state.
Chapter Summary
- Conditional probability, $P(A \mid B)$, quantifies the likelihood of event A occurring given that event B has already occurred, formally defined as $P(A \mid B) = \frac{P(A \cap B)}{P(B)}$ for $P(B) > 0$.
- The Multiplication Rule, $P(A \cap B) = P(B)\,P(A \mid B)$, is fundamental for calculating the probability of the intersection of events, particularly useful in sequential processes or when events are dependent.
- Two events A and B are independent if the occurrence of one does not affect the probability of the other. This is mathematically expressed as $P(A \cap B) = P(A)\,P(B)$, or equivalently, $P(A \mid B) = P(A)$.
- Events are dependent if the probability of one event is influenced by the occurrence or non-occurrence of another, meaning $P(A \mid B) \neq P(A)$.
- Careful interpretation of problem statements is crucial to distinguish between joint probabilities ($P(A \cap B)$), conditional probabilities ($P(A \mid B)$ or $P(B \mid A)$), and marginal probabilities ($P(A)$ or $P(B)$).
- Tree diagrams are a powerful visual tool for breaking down complex scenarios involving sequences of dependent events, aiding in the application of the multiplication rule and calculation of conditional probabilities.
Chapter Review Questions
:::question type="MCQ" question="An urn contains 5 red balls and 3 blue balls. Two balls are drawn without replacement. What is the probability that the second ball drawn is red, given that the first ball drawn was blue?" options=["","","",""] answer="" hint="Consider the composition of the urn after the first ball is drawn." solution="Let R1 be the event that the first ball is red, and B1 be the event that the first ball is blue. Let R2 be the event that the second ball is red.
We are looking for $P(R_2 \mid B_1)$.
If the first ball drawn was blue, then there are 7 balls remaining in the urn: 5 red and 2 blue.
The probability of drawing a red ball next is $\frac{5}{7}$.
So, $P(R_2 \mid B_1) = \frac{5}{7}$."
:::
:::question type="NAT" question="For two events A and B, , , and . Calculate ." answer="0.5" hint="First, find using the formula for the probability of the union of two events." solution="We know the formula for the probability of the union of two events:
Substituting the given values:
Now, we can calculate the conditional probability using its definition:
."
:::
:::question type="MCQ" question="A fair coin is tossed twice. Let event E be 'the first toss is a head' and event F be 'the two tosses are the same (both heads or both tails)'. Are E and F independent?" options=["Yes, they are independent." ,"No, they are dependent." ,"Only if the coin is biased." ,"Cannot be determined without more information."] answer="Yes, they are independent." hint="List the sample space and the outcomes for E, F, and $E \cap F$. Then check if $P(E \cap F) = P(E)\,P(F)$." solution="The sample space for two coin tosses is $\{HH, HT, TH, TT\}$. Each outcome has a probability of $\frac{1}{4}$.
Event E: 'the first toss is a head' $= \{HH, HT\}$.
$P(E) = \frac{2}{4} = \frac{1}{2}$.
Event F: 'the two tosses are the same' $= \{HH, TT\}$.
$P(F) = \frac{2}{4} = \frac{1}{2}$.
Event $E \cap F$: 'the first toss is a head AND the two tosses are the same' $= \{HH\}$.
$P(E \cap F) = \frac{1}{4}$.
To check for independence, we compare $P(E \cap F)$ with $P(E) \times P(F)$:
$P(E) \times P(F) = \frac{1}{2} \times \frac{1}{2} = \frac{1}{4}$.
Since $P(E \cap F) = P(E)\,P(F)$ ($\frac{1}{4} = \frac{1}{4}$), events E and F are independent.
Therefore, the correct answer is 'Yes, they are independent.'."
:::
What's Next?
This chapter on conditional events forms a crucial foundation for more advanced topics in probability. The principles learned here are directly applied in understanding Bayes' Theorem, which deals with updating probabilities based on new evidence. Furthermore, a solid grasp of conditional probability is essential before delving into the study of random variables, their probability distributions (e.g., binomial, Poisson, normal), and the concepts of expectation and variance, which are central to statistical inference and modeling.