Segment 2 Sanmit Narvekar


Segment 2

To Calculate

1. If the knight had captured a Gnome instead of a Troll, what would his chances be of crossing safely?

If the knight captures a gnome, the only bridge that is guaranteed to be safe is H3. Thus, we are interested in the probability that the knight is on H3 given that he has captured a gnome.


First, we calculate the marginal probability of capturing a gnome, which will be useful in the calculation that follows:

<math>P(\mathrm{Gnome}) = \sum_i P(H_i)\, P(\mathrm{Gnome} \mid H_i) = (1/5)(3/5) + (1/5)(4/5) + (3/5)(1) = 22/25</math>

And now we apply Bayes' theorem to calculate the probability of being on H3:

<math>P(H_3 \mid \mathrm{Gnome}) = \frac{P(H_3)\, P(\mathrm{Gnome} \mid H_3)}{P(\mathrm{Gnome})} = \frac{(3/5)(1)}{(22/25)} = 15/22</math>
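
As a quick check, the same arithmetic can be done with exact fractions in a few lines of Python. This is only a sketch; the bridge priors P(H1) = P(H2) = 1/5, P(H3) = 3/5 and the gnome-capture likelihoods 3/5, 4/5, 1 are taken directly from the formulas above.

<syntaxhighlight lang="python">
from fractions import Fraction

# Priors over which bridge the knight is on, and the likelihood of
# capturing a gnome on each bridge (values from the calculation above).
prior = {"H1": Fraction(1, 5), "H2": Fraction(1, 5), "H3": Fraction(3, 5)}
likelihood = {"H1": Fraction(3, 5), "H2": Fraction(4, 5), "H3": Fraction(1)}

# Marginal: P(Gnome) = sum_i P(H_i) * P(Gnome | H_i)
p_gnome = sum(prior[h] * likelihood[h] for h in prior)
print(p_gnome)  # 22/25

# Bayes' theorem: P(H3 | Gnome) = P(H3) * P(Gnome | H3) / P(Gnome)
print(prior["H3"] * likelihood["H3"] / p_gnome)  # 15/22
</syntaxhighlight>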


2. Suppose that we have two identical boxes, A and B. A contains 5 red balls and 3 blue balls. B contains 2 red balls and 4 blue balls. A box is selected at random and exactly one ball is drawn from the box. What is the probability that it is blue? If it is blue, what is the probability that it came from box B?

To calculate the marginal probability of drawing a blue ball, we use the law of total probability, summing over the two possible boxes:

<math>P(\mathrm{Blue}) = \sum_{i \in \{A,B\}} P(\mathrm{Box}_i)\, P(\mathrm{Blue} \mid \mathrm{Box}_i) = P(\mathrm{Box}_A)\, P(\mathrm{Blue} \mid \mathrm{Box}_A) + P(\mathrm{Box}_B)\, P(\mathrm{Blue} \mid \mathrm{Box}_B) = (1/2)(3/8) + (1/2)(4/6) = 25/48</math>

To calculate the probability that a blue ball came from box B, we use Bayes' rule (and conveniently reuse the marginal from the first part in the denominator):

<math>P(\mathrm{Box}_B \mid \mathrm{Blue}) = \frac{P(\mathrm{Box}_B)\, P(\mathrm{Blue} \mid \mathrm{Box}_B)}{P(\mathrm{Blue})} = \frac{(1/2)(4/6)}{(25/48)} = 16/25</math>
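
The same check works here. Again just a sketch: the box prior of 1/2 each and the blue-ball likelihoods 3/8 (box A: 5 red + 3 blue) and 4/6 (box B: 2 red + 4 blue) come from the problem statement above.

<syntaxhighlight lang="python">
from fractions import Fraction

# Each box is chosen with probability 1/2.
p_box = {"A": Fraction(1, 2), "B": Fraction(1, 2)}

# Likelihood of drawing a blue ball from each box:
# box A has 5 red + 3 blue, box B has 2 red + 4 blue.
p_blue_given_box = {"A": Fraction(3, 8), "B": Fraction(4, 6)}

# Law of total probability: P(Blue) = sum_i P(Box_i) * P(Blue | Box_i)
p_blue = sum(p_box[b] * p_blue_given_box[b] for b in p_box)
print(p_blue)  # 25/48

# Bayes' rule: P(Box_B | Blue) = P(Box_B) * P(Blue | Box_B) / P(Blue)
print(p_box["B"] * p_blue_given_box["B"] / p_blue)  # 16/25
</syntaxhighlight>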

To Think About

1. Do you think that the human brain's intuitive "inference engine" obeys the commutativity and associativity of evidence? For example, are we more likely to be swayed by recent, rather than older, evidence? How can evolution get this wrong if the mathematical formulation is correct?

I think this really depends on how much you trust the new evidence versus the old evidence (or your existing model). From a robotics standpoint, if your "sensors" are very noisy, you may want to be selective about what new evidence you incorporate and how quickly you incorporate it. When it comes to people, it also depends on which model the new evidence is trying to modify. For evidence relating to vision, perception, and the like, the newest evidence would be weighted the most (assuming you trust your eyes). However, if someone tried to provide evidence for something that contradicts your core beliefs, it would be much harder to accept unless (1) you trust them or (2) the evidence is irrefutable.


Comments