# Segment 1. Let's Talk about Probability

## Contents

#### Watch this segment

(Don't worry: the out-of-focus frame you see below is not the beginning of the segment. Press the play button to start at the beginning, in focus.)

{{#widget:Iframe |url=http://www.youtube.com/v/H5WjVgL6Nh4&hd=1 |width=800 |height=625 |border=0 }}

The direct YouTube link is http://youtu.be/H5WjVgL6Nh4

Links to the slides: PDF file or PowerPoint file

#### Bill's comments on this segment

Well, I do sound nervous! This was one of my first webcasts. The production values get a little better with later segments. However, the material here is important, so be sure you understand it before going on.

Here is a link to the paper by R.T. Cox, discussed on slide 2. It's surprisingly readable for something so fundamental.

### Problems

#### To Calculate

1. Prove that <math>P(ABC) = P(B)P(C|B)P(A|BC)</math>.

2. What is the probability that the sum of two dice is odd and neither die shows a 4?
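Neither calculation requires a computer, but both can be sanity-checked by brute-force enumeration. Here is a minimal Python sketch (the three-event joint distribution used for part 1 is just a randomly generated example, not anything from the lecture):

```python
from fractions import Fraction
from itertools import product
import random

# --- Problem 1: sanity-check P(ABC) = P(B) P(C|B) P(A|BC) numerically ---
# Build a random joint distribution over three binary events A, B, C.
random.seed(0)
weights = [random.random() for _ in range(8)]
total = sum(weights)
joint = {abc: w / total for abc, w in zip(product([0, 1], repeat=3), weights)}

def prob(pred):
    """Probability of the event {(a, b, c) : pred(a, b, c)}."""
    return sum(p for (a, b, c), p in joint.items() if pred(a, b, c))

p_abc = prob(lambda a, b, c: a and b and c)
p_b = prob(lambda a, b, c: b)
p_c_given_b = prob(lambda a, b, c: b and c) / p_b
p_a_given_bc = p_abc / prob(lambda a, b, c: b and c)
chain = p_b * p_c_given_b * p_a_given_bc
print(abs(p_abc - chain) < 1e-12)   # the two sides agree

# --- Problem 2: enumerate the 36 equally likely outcomes of two dice ---
favorable = sum(1 for d1, d2 in product(range(1, 7), repeat=2)
                if (d1 + d2) % 2 == 1 and d1 != 4 and d2 != 4)
answer = Fraction(favorable, 36)
print(answer)
```

Of course, the numerical check of part 1 is no substitute for a proof; it only confirms the identity on one example.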

#### To Think About

1. First-order logic extends the propositional calculus, with propositions <math>a,b,c</math>, by adding the quantifier symbols <math>\forall</math> ("for all") and <math>\exists</math> ("there exists"). This allows statements like "Socrates is a philosopher", "Socrates is a man", and "There exists a philosopher who is not a man". Can you use first-order logic as a calculus of inference? Is it the same as using the probability axioms? If not, which of Cox's suppositions is violated?

2. You are an oracle that, when asked, says "yes" with probability <math>P</math> and "no" with probability <math>1-P</math>. How do you do this using only a fair, two-sided coin?

3. For the trout/minnow problem, suppose you want to know the probability that the <math>N</math>th fish caught is a trout, for <math>N=1,2,3,\ldots</math> What is an efficient way to set up this calculation? (Hint: if you ever learned the word "Markov", this might be a good time to remember it!)
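For problem 2 above, one classical approach (a sketch of one possible answer, not necessarily the one intended in the segment) is to compare fair-coin flips against the binary expansion of <math>P</math>: the flips generate a uniform number <math>U \in [0,1)</math> one bit at a time, and the oracle says "yes" exactly when <math>U < P</math>. The comparison is settled at the first flip that differs from the corresponding bit of <math>P</math>, so on average only two flips are needed:

```python
import random

def oracle(p, flip=lambda: random.getrandbits(1)):
    """Answer "yes" with probability p using only fair coin flips.

    The flips build a uniform U in [0, 1) bit by bit; we return "yes"
    exactly when U < p, which is decided at the first bit where the
    flip differs from the corresponding binary digit of p.
    """
    while True:
        digit = 1 if 2 * p >= 1 else 0     # next binary digit of p
        p = 2 * p - digit                  # shift that digit off
        b = flip()                         # next binary digit of U
        if b < digit:
            return "yes"                   # U < p, decided
        if b > digit:
            return "no"                    # U > p, decided
        # b == digit: tied so far, go on to the next bit

random.seed(1)
trials = 100_000
freq = sum(oracle(0.3) == "yes" for _ in range(trials)) / trials
print(freq)   # should be close to 0.3
```

The loop terminates with probability 1, since each round ends it with probability 1/2.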
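For problem 3, the hint suggests propagating a probability distribution over states forward one catch at a time. The sketch below is purely illustrative: it assumes a pond holding T trout and M minnows sampled without replacement (made-up numbers; the actual setup is the one defined in the segment), with the state being the number of trout caught so far:

```python
# Illustrative sketch only: T and M are assumed numbers, not from the problem.
T, M = 3, 7            # trout and minnows initially in the pond (assumed)

# state after k catches = number of trout caught so far; dist[t] = P(state = t)
dist = {0: 1.0}
probs = []             # probs[k] = P((k+1)th fish caught is a trout)
for k in range(T + M):
    remaining = T + M - k
    new_dist = {}
    p_trout_next = 0.0
    for t, p in dist.items():
        p_trout = (T - t) / remaining      # chance the next fish is a trout
        p_trout_next += p * p_trout
        if p_trout > 0:
            new_dist[t + 1] = new_dist.get(t + 1, 0.0) + p * p_trout
        if p_trout < 1:
            new_dist[t] = new_dist.get(t, 0.0) + p * (1 - p_trout)
    probs.append(p_trout_next)
    print(f"P({k + 1}th fish is a trout) = {p_trout_next:.6f}")
    dist = new_dist
```

One forward pass like this yields the answer for every N at once, with only O(k) states alive at step k. Running it (and thinking about why the numbers come out the way they do) is a good check on your pencil-and-paper answer.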