Fundamental Statistics Theory Notes (7)

Estimating parameters from a random sample

Recall that for a random sample $X_1, X_2, \ldots, X_n$ i.i.d. from a PMF/PDF $f(x)$, the joint PMF/PDF can be written:
$f(x_1, x_2, \ldots, x_n)=\prod_{i=1}^n f(x_i)$
Usually, $f(x)$ will depend on some unknown parameter(s), such as $\theta$.
Notation: $f(x|\theta)$, where $\theta$ can be an unknown mean or variance.
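For example (a worked case with the Bernoulli model chosen here as an illustration; it is not in the original notes): if $X_1, \ldots, X_n$ are i.i.d. $Bernoulli(p)$, then
$$f(x_1, \ldots, x_n|p)=\prod_{i=1}^n p^{x_i}(1-p)^{1-x_i}=p^{\sum_i x_i}(1-p)^{n-\sum_i x_i}$$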

Goal: Construct estimators for $\theta$ based on $X_1, X_2, \ldots, X_n$.

Definition:
Let $f(x_1, x_2, \ldots, x_n)$ denote the joint PMF/PDF of $X_1, X_2, \ldots, X_n$. Given observed data $X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n$, the likelihood function (as a function of $\theta$) is $L(\theta|x_1, x_2, \ldots, x_n)=f(x_1, x_2, \ldots, x_n|\theta)$. If the observations are i.i.d., this is also equal to $\prod_{i=1}^n f(x_i|\theta)$.
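A minimal numerical sketch of this definition (not from the original notes; the $N(\theta, 1)$ model and the sample values are illustrative assumptions): for i.i.d. data, the likelihood of a candidate $\theta$ is just the product of the per-observation densities evaluated at the observed data.

```python
# Minimal sketch (assumed model: N(theta, 1); data values are hypothetical).
import numpy as np
from scipy.stats import norm

def likelihood(theta, xs):
    # L(theta | x_1, ..., x_n) = prod_i f(x_i | theta)
    return np.prod(norm.pdf(xs, loc=theta, scale=1.0))

xs = np.array([4.8, 5.1, 5.4])           # hypothetical observed sample
for theta in (4.0, 5.0, 6.0):            # compare candidate parameter values
    print(theta, likelihood(theta, xs))  # largest near theta = mean(xs)
```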

Ex. The Gators won 3 out of 3 games. Let $X$ denote the number of wins. Assume $X\sim Bin(3,p)$; $p$ is the unknown parameter to be estimated from the data.
$$f(x|p)=\binom{3}{x}p^x (1-p)^{3-x},\quad \text{for } x=0,1,2,3$$
Data:
$x=3 \Rightarrow L(p|x=3)=\binom{3}{3}p^3(1-p)^{3-3}=p^3$, as a function of $p$, $0\leq p\leq 1$.

This tells us the probability of the data (3 wins) for different values of $p$.
e.g. $\underbrace{L(\frac{1}{2}|x=3)}_{1/8}<\underbrace{L(\frac{3}{4}|x=3)}_{27/64}$, so $p=\frac{3}{4}$ seems more plausible than $p=\frac{1}{2}$.
In this case, $p=1$ maximizes the likelihood function.
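To reproduce this numerically (a minimal sketch in plain Python; the only input taken from the notes is the observed data $x=3$ with $n=3$):

```python
# Evaluate L(p | x = 3) = C(3,3) p^3 (1-p)^0 = p^3 on a grid of p values
# and locate the maximizer, reproducing the comparison above.
from math import comb

def L(p, x=3, n=3):
    # Binomial likelihood: L(p | x) = C(n, x) p^x (1-p)^(n-x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

print(L(1/2))              # 0.125    = 1/8
print(L(3/4))              # 0.421875 = 27/64
grid = [i / 1000 for i in range(1001)]
print(max(grid, key=L))    # 1.0 -- the maximizer, as claimed above
```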

Definition:
The maximum likelihood estimate (MLE) is the value of $\theta$ that maximizes $L(\theta|x_1, x_2, \ldots, x_n)$.
Interpretation:
The MLE of $\theta$ is the value of $\theta$ that makes the observed data most likely to have occurred.
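For the Gators example this can also be checked analytically (a standard calculation, added here because it is not spelled out above): for $X\sim Bin(n,p)$, the log-likelihood is
$$\log L(p|x)=\log\binom{n}{x}+x\log p+(n-x)\log(1-p)$$
and setting its derivative $\frac{x}{p}-\frac{n-x}{1-p}$ equal to zero gives $\hat{p}=\frac{x}{n}$. With $x=3$ and $n=3$, this yields $\hat{p}=1$, agreeing with the grid search above.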
