Segment 1. Let's Talk about Probability

From Computational Statistics Course Wiki

Watch this segment

(Don't worry: the out-of-focus frame you see below is not the beginning of the segment. Press the play button to start at the beginning, in focus.)

The direct YouTube link is

Links to the slides: PDF file or PowerPoint file

Bill's comments on this segment

Well, I do sound nervous! This was one of my first webcasts. The production values get a little better with later segments. However, the material here is important, so be sure you understand it before going on.

Here is a link to the paper by R.T. Cox, discussed on slide 2. It's surprisingly readable for something so fundamental.


To Calculate

1. Prove that P(ABC) = P(B)P(C|B)P(A|BC).
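One way to sketch the proof is to apply the product rule P(XY) = P(X)P(Y|X) twice, first treating BC as a single event:

```latex
P(ABC) = P(BC)\,P(A \mid BC)              % product rule, with X = BC
       = P(B)\,P(C \mid B)\,P(A \mid BC)  % product rule again, on P(BC)
```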

2. What is the probability that the sum of two dice is odd with neither being a 4?
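Since two dice have only 36 equally likely outcomes, the probability can be checked by brute-force enumeration; here is a minimal sketch:

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely outcomes of two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

# Keep outcomes where the sum is odd and neither die shows a 4.
favorable = [(a, b) for a, b in outcomes
             if (a + b) % 2 == 1 and a != 4 and b != 4]

p = Fraction(len(favorable), len(outcomes))
print(p)  # 1/3
```

An odd sum requires one odd die (1, 3, or 5) and one even die (2 or 6 once 4 is excluded), giving 2 × (3 × 2) = 12 of the 36 outcomes.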

To Think About

1. First-order logic is a type of propositional calculus with propositions a, b, c and quantifier symbols ∀ ("for all") and ∃ ("there exists"). This allows statements like "Socrates is a philosopher", "Socrates is a man", "There exists a philosopher who is not a man", etc. Can you use first-order logic as a calculus of inference? Is it the same as using the probability axioms? If not, then which of Cox's suppositions is violated?

2. You are an oracle that, when asked, says "yes" with probability P and "no" with probability 1−P. How do you do this using only a fair, two-sided coin? As we did in class: represent P as a binary number and flip the coin once per binary digit. Whenever a flip first disagrees with the corresponding digit of P, stop: answer "yes" if the flip is 0 and the digit is 1, and "no" otherwise.

3. For the trout/minnow problem, what if you want to know the probability that the Nth fish caught is a trout, for N = 1, 2, 3, ...? What is an efficient way to set up this calculation? (Hint: If you ever learned the word "Markov", this might be a good time to remember it!)
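The exact pond setup was given in class, so purely for illustration assume a pond with T trout and M minnows, caught uniformly at random without replacement. The Markov idea is to track a distribution over states (trout remaining, minnows remaining) and push it forward one catch at a time; all numbers below are assumed, not from the segment:

```python
from fractions import Fraction

def prob_nth_is_trout(T, M, N):
    """P(the Nth fish caught is a trout), catching without replacement.

    Markov-style: keep a distribution over states (t, m) = fish remaining,
    and propagate it forward one catch at a time.
    """
    dist = {(T, M): Fraction(1)}        # start: the full pond, with certainty
    for _ in range(N - 1):              # advance the first N-1 catches
        new = {}
        for (t, m), p in dist.items():
            total = t + m
            if t:                       # this catch was a trout
                s = (t - 1, m)
                new[s] = new.get(s, Fraction(0)) + p * Fraction(t, total)
            if m:                       # this catch was a minnow
                s = (t, m - 1)
                new[s] = new.get(s, Fraction(0)) + p * Fraction(m, total)
        dist = new
    # probability that the next (i.e. Nth) catch is a trout
    return sum(p * Fraction(t, t + m) for (t, m), p in dist.items())

print([prob_nth_is_trout(3, 5, n) for n in (1, 2, 3)])  # all 3/8, by symmetry
```

In this without-replacement setup the answer is T/(T+M) for every N (exchangeability), which makes a handy sanity check on the state-propagation machinery; the same machinery generalizes to settings where the catch probabilities do depend on the state.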

Class Activity