Segment 7


Calculation Problems

1. Prove the result the "mechanical way": the value of <math>a</math> that minimizes the mean squared deviation <math>\Delta^2 = <(x-a)^2></math> is the mean <math><x></math>. <math> \begin{align} \Delta^2 &= < (x-a)^2 > \\ &= < x^2 -2ax + a^2 > \\ \frac{d\Delta^2}{da} &= <-2x + 2a> = 2(a - <x>) = 0\\ a &= <x> \end{align} </math>
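As a quick numerical sanity check (a minimal sketch; the sample values below are made up), scanning candidate values of <math>a</math> confirms that the minimizer of <math><(x-a)^2></math> is the sample mean:

<syntaxhighlight lang="python">
import numpy as np

# Made-up sample; any data set works for this check.
x = np.array([1.2, 3.4, 2.2, 5.1, 0.7])

# Evaluate Delta^2(a) = <(x - a)^2> on a grid of candidate a values.
a_grid = np.linspace(x.min(), x.max(), 10001)
delta2 = np.mean((x[:, None] - a_grid[None, :]) ** 2, axis=0)

a_best = a_grid[np.argmin(delta2)]
print(a_best, x.mean())  # the two agree, up to the grid spacing
</syntaxhighlight>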

2. Construct a distribution <math>p(x)</math>, with its maximum at zero, whose third moment <math>M_3</math> exists but whose fourth moment <math>M_4</math> does not. Thought process while solving the problem:

  1. It is easier to construct a piecewise function with its maximum at zero.
  2. To ensure that the function is a probability distribution, I split it into the pieces <math>0</math> to <math>1</math> and <math>1</math> to <math>\infty</math>.
  3. For <math>M_4</math> not to exist, the tail should fall off as <math>x^{-5}</math>, so that the <math>M_4</math> integrand behaves like <math>1/x</math> and its integral diverges.

<math> p(x) = \begin{cases} 0, & \text{if } x \le 0\\ \frac34, & \text{if } 0 < x \le 1 \\ \frac1{x^5}, & \text{if } x > 1 \\ \end{cases} </math>

<math> \begin{align} \int_{-\infty}^{\infty}p(x)\,dx &= \frac34 + \frac14 = 1\\ <x^3> &= \int_0^1 \frac34 x^3\,dx + \int_1^\infty x^{-2}\,dx = \frac3{16} + 1 = \frac{19}{16} \\ <x^4> &= \text{does not exist, since } \int_1^\infty x^4 \cdot x^{-5}\,dx = \int_1^\infty \frac{dx}{x} \text{ diverges} \end{align} </math>
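A quick check of these integrals with scipy (a sketch, assuming the density above): the normalization and <math>M_3</math> converge, while the <math>M_4</math> partial integrals keep growing with the upper limit:

<syntaxhighlight lang="python">
import numpy as np
from scipy.integrate import quad

def p(x):
    """The piecewise density defined above."""
    if x <= 0:
        return 0.0
    if x <= 1:
        return 0.75
    return x ** -5

print(quad(p, 0, 1)[0] + quad(p, 1, np.inf)[0])  # 1.0 (normalized)

m3 = quad(lambda t: t**3 * p(t), 0, 1)[0] + quad(lambda t: t**3 * p(t), 1, np.inf)[0]
print(m3)  # 1.1875 = 19/16

# M_4's tail integrand is x^4 * x^-5 = 1/x, so the partial integrals
# grow without bound (logarithmic divergence).
for L in (1e2, 1e4, 1e6):
    print(L, quad(lambda t: t**4 * p(t), 1, L)[0])
</syntaxhighlight>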

3. Positives and negatives of using the median instead of the mean.

Positives: the mean is sensitive to outliers, which can skew its estimate of the central tendency, while the median is robust and is often a better estimate of the typical value.

Negatives: the mean makes fuller use of the data; when the distribution is close to normal, the mean is the more efficient estimator of the central tendency, and the median is less efficient (its sampling variance is larger). The simulation sketch below illustrates both points.
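A minimal simulation sketch of the trade-off: for normal data the sample mean has the smaller spread, while for heavy-tailed (outlier-prone) data the median wins. The sample sizes and seed are arbitrary choices.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Normal data: the mean is the more efficient estimator; the median's
# standard error is about sqrt(pi/2) ~ 1.25 times larger.
normal = rng.normal(size=(100_000, 25))
print(normal.mean(axis=1).std(), np.median(normal, axis=1).std())

# Heavy-tailed data (Student t, 2 degrees of freedom): outliers blow up
# the mean's spread, and the median is far more stable.
heavy = rng.standard_t(df=2, size=(100_000, 25))
print(heavy.mean(axis=1).std(), np.median(heavy, axis=1).std())
</syntaxhighlight>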


Food for Thought Problems

Class Activity

Group Activity with Jin, Rcardenas, Lori

1. What does a joint uniform prior on w and b look like?

Let P(X,Y) denote P(w=X, b=Y). The prior is uniform, so the density is the same constant for every allowed pair (X,Y), i.e., X, Y ≥ 0 with X + Y ≤ 1 (since d = 1 − w − b ≥ 0).

<math> P = \int_0^1\int_0^{1-X} P(X,Y)\,dY\,dX = 1</math>

Carrying out the integral, we get <math> P(X,Y)\cdot \left(X-\frac{X^2}2\right)\Big|_0^1 = \frac{P(X,Y)}2 = 1 \rightarrow P(X,Y) = 2</math>, as checked numerically below.
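A Monte Carlo sanity check (a small sketch): the allowed region {w ≥ 0, b ≥ 0, w + b ≤ 1} has area 1/2, so a uniform density on it must equal 2.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)

# Sample the unit square and count the fraction landing in the triangle
# w + b <= 1; that fraction estimates the region's area.
pts = rng.random((1_000_000, 2))
area = (pts.sum(axis=1) <= 1.0).mean()
print(area, 1.0 / area)  # ~0.5 and ~2: the uniform prior density is 2
</syntaxhighlight>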


2. Suppose we know that w = 0.4, b = 0.3, and d = 0.3. If we watch N = 10 games, what is the probability that W = 3, B = 5, and D = 2?

<math> P(3,5,2) = \binom{10}{3} \cdot \binom{7}{5} \cdot w^3 \cdot b^5 \cdot d^2 = 0.0353</math>

3. For general w, b, d, W, B, D, what is P(W, B, D | w, b, d)?

<math> P(W,B,D|w,b,d) = \binom{N}{W} \cdot \binom{N-W}{B} \cdot w^W \cdot b^B \cdot d^D, \qquad N = W + B + D </math>
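A small check of this formula (a sketch): writing it with the two binomial coefficients and comparing against scipy's multinomial distribution, which also reproduces the 0.0353 from problem 2:

<syntaxhighlight lang="python">
from math import comb
from scipy.stats import multinomial

def p_WBD(W, B, D, w, b, d):
    """P(W, B, D | w, b, d) via the two binomial coefficients."""
    N = W + B + D
    return comb(N, W) * comb(N - W, B) * w**W * b**B * d**D

print(p_WBD(3, 5, 2, 0.4, 0.3, 0.3))                        # ~0.0353
print(multinomial.pmf([3, 5, 2], n=10, p=[0.4, 0.3, 0.3]))  # same value
</syntaxhighlight>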

4. Applying Bayes, what is P(w, b, d | W, B, D)? What is the Bayes denominator?

<math> P(w,b,d | W, B, D) = \frac{P(W,B,D|w,b,d) \cdot P(w,b,d)}{\int_0^1 \int_0^{1-w} P(W,B,D|w,b,d) \cdot P(w,b,d)\,db\,dw}=\frac{w^W \cdot b^B \cdot d^D}{\frac{W! \cdot B! \cdot D!}{(W+B+D+2)!}} </math>

The binomial coefficients and the constant prior <math>P(w,b,d) = 2</math> appear in both the numerator and the denominator, so they cancel.

The denominator is the Dirichlet integral over the simplex:

<math>\int_0^1 \int_0^{1-w} w^W \cdot b^B \cdot (1-w-b)^D\,db\,dw = \frac{W! \cdot B! \cdot D!}{(W+B+D+2)!}</math>
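The denominator can be verified symbolically for particular counts (a sympy sketch, reusing the counts from problem 2):

<syntaxhighlight lang="python">
from math import factorial
import sympy as sp

w, b = sp.symbols('w b', nonnegative=True)

def bayes_denominator(W, B, D):
    """Integrate w^W * b^B * (1-w-b)^D over 0 <= b <= 1-w, 0 <= w <= 1."""
    integrand = w**W * b**B * (1 - w - b)**D
    return sp.integrate(integrand, (b, 0, 1 - w), (w, 0, 1))

W, B, D = 3, 5, 2
exact = bayes_denominator(W, B, D)
formula = sp.Rational(factorial(W) * factorial(B) * factorial(D),
                      factorial(W + B + D + 2))
print(exact, formula, sp.simplify(exact - formula) == 0)  # equal
</syntaxhighlight>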

5. Using the data from last Friday, count the outcomes of the first N games and produce a visualization of the joint posterior of the win rates for N = 0, 3, 10, 100, 1000, and 10000.


An interesting observation is that the more games we count, the more concentrated the posterior becomes; that is, we grow more confident in the estimated probabilities as more games are counted.
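A minimal sketch of how one of these heat maps could be produced; the counts (W, B, D) below are hypothetical stand-ins for the tallies from the first N games, and the plot shows the unnormalized posterior <math>w^W b^B d^D</math> on the simplex:

<syntaxhighlight lang="python">
import numpy as np
import matplotlib.pyplot as plt

def plot_posterior(W, B, D, n_grid=200):
    """Heat map of the unnormalized posterior w^W * b^B * (1-w-b)^D."""
    w = np.linspace(0.0, 1.0, n_grid)
    b = np.linspace(0.0, 1.0, n_grid)
    wg, bg = np.meshgrid(w, b)
    dg = 1.0 - wg - bg
    # Mask the region outside the simplex (d < 0); counts are integers,
    # so dg**D is well defined everywhere before masking.
    post = np.where(dg >= 0.0, wg**W * bg**B * dg**D, np.nan)
    plt.imshow(post, origin='lower', extent=(0, 1, 0, 1))
    plt.xlabel('w'); plt.ylabel('b')
    plt.title(f'Unnormalized posterior, W={W}, B={B}, D={D}')
    plt.colorbar()
    plt.show()

# Hypothetical counts after N = 10 games (the real tallies come from the class data).
plot_posterior(3, 5, 2)
</syntaxhighlight>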

Posterior plots: N = 0 (N0.png), N = 3 (N3.png), N = 10 (N10.png), N = 100 (N100.png), N = 1000 (N1000.png), N = 10000 (N10000.png).