<aside> <img src="https://s3-us-west-2.amazonaws.com/secure.notion-static.com/0b3e52d2-4c8a-468a-b546-b1709b5d74ef/law_of_large_number.png" alt="https://s3-us-west-2.amazonaws.com/secure.notion-static.com/0b3e52d2-4c8a-468a-b546-b1709b5d74ef/law_of_large_number.png" width="40px" /> The law of large numbers states that the sample mean tends to the true mean as the number of samples tends to infinity:

$$ \langle X_N \rangle = \frac{1}{N} \sum_{i=1}^{N} X_i \to \mu \quad {\rm for} \quad N \to \infty $$

</aside>
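The convergence above is easy to see numerically. A minimal sketch (standard library only; the uniform distribution and the sample sizes are illustrative choices, not from the original):

```python
import random
import statistics

random.seed(0)  # fixed seed so the run is reproducible

# Draw n samples from a uniform distribution on [0, 1]; the true mean is mu = 0.5.
def sample_mean(n):
    return statistics.fmean(random.random() for _ in range(n))

# As N grows, the sample mean <X_N> approaches the true mean mu.
means = {n: sample_mean(n) for n in (10, 1_000, 100_000)}
```

For small `n` the sample mean can land noticeably away from 0.5, while at `n = 100_000` it is within a fraction of a percent of the true mean.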

The central limit theorem

<aside> <img src="https://s3-us-west-2.amazonaws.com/secure.notion-static.com/734e3a6a-19d3-4477-b8b7-c2c4ad84048b/central_limit_theorem.png" alt="https://s3-us-west-2.amazonaws.com/secure.notion-static.com/734e3a6a-19d3-4477-b8b7-c2c4ad84048b/central_limit_theorem.png" width="40px" /> The central limit theorem states that if independent random samples are added, the distribution of their sum looks increasingly like a Gaussian distribution with mean $\sum\mu_i$ and variance $\sum\sigma_i^2$ as $N$ increases:

$$ \sum_{i=1}^N X_i \xrightarrow{d} \mathcal{N}\Big(\sum_i \mu_i,\ \sum_i \sigma_i^2\Big) \quad {\rm for} \quad N \to \infty $$

</aside>
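A quick numerical check of the theorem (a sketch, assuming i.i.d. uniform summands; $N$ and the number of repetitions are illustrative):

```python
import random
import statistics

random.seed(1)

# Sum N i.i.d. uniform[0, 1] variables. Each has mu = 0.5 and sigma^2 = 1/12,
# so by the CLT the sum is approximately N(N * 0.5, N / 12) for large N.
N = 1_000
sums = [sum(random.random() for _ in range(N)) for _ in range(2_000)]

emp_mean = statistics.fmean(sums)    # near the predicted mean  N * 0.5 = 500
emp_var = statistics.variance(sums)  # near the predicted variance N / 12 ~ 83.3
```

A histogram of `sums` would show the characteristic bell shape, even though each individual summand is uniform rather than Gaussian.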

Parameter estimation

$$ \begin{aligned} &\bullet \text{ Continuous case: }\quad&E[a(x)]=\int_\Omega a(x)\mathcal L(x)\,\text dx \\ &\bullet \text{ Discrete case: }\quad&E[a(x)]=\sum_{i=1}^N a(x_i)\mathcal L(x_i) \end{aligned} $$
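When the $x_i$ are drawn from the distribution itself, the discrete sum reduces to an equal-weight average (each sample carries weight $1/N$), which is the Monte Carlo estimate of the integral. A sketch, assuming $a(x)=x^2$ and $X \sim$ uniform$[0,1]$ as an illustrative choice, where the exact answer is $E[X^2]=1/3$:

```python
import random

random.seed(3)

# Monte Carlo estimate of E[a(X)] for a(x) = x^2, X ~ uniform[0, 1].
# The equal-weight sample average (1/N) * sum a(x_i) approximates the
# continuous integral, whose exact value here is 1/3.
N = 200_000
estimate = sum(random.random() ** 2 for _ in range(N)) / N
```

By the law of large numbers stated earlier, `estimate` converges to the true expectation as `N` grows.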

<aside> <img src="https://s3-us-west-2.amazonaws.com/secure.notion-static.com/26c3fb2e-e7a1-4bc8-ad5e-c43aca026c45/Estimator.png" alt="https://s3-us-west-2.amazonaws.com/secure.notion-static.com/26c3fb2e-e7a1-4bc8-ad5e-c43aca026c45/Estimator.png" width="40px" /> Estimator: a procedure applied to the data sample that gives a numerical value for a parameter and/or a property of the parent population/distribution function

</aside>

$$ \begin{aligned} &\text{Consistent:}\quad &\lim_{N \to \infty} \widehat{a} = a \\ &\text{Unbiased:}\quad &E[\hat a]=a \\ &\text{Efficient:}\quad &\text{small variance} \end{aligned} $$
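The bias property can be demonstrated numerically with the classic example of the sample variance: dividing by $N$ gives a biased estimator of $\sigma^2$, while dividing by $N-1$ is unbiased. A sketch (standard normal data, so $\sigma^2 = 1$; sample and trial sizes are illustrative):

```python
import random
import statistics

random.seed(2)

# Biased variance estimator: divides by N instead of N - 1.
def biased_var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Repeated small samples (n = 5) from N(0, 1), where the true variance is 1.
trials = [[random.gauss(0, 1) for _ in range(5)] for _ in range(20_000)]

# Averaging each estimator over many trials exposes its expectation E[a-hat]:
# the 1/N estimator averages to (n-1)/n * sigma^2 = 0.8, the 1/(N-1) one to 1.
avg_biased = statistics.fmean(biased_var(t) for t in trials)
avg_unbiased = statistics.fmean(statistics.variance(t) for t in trials)
```

Here `statistics.variance` implements the unbiased $1/(N-1)$ form, so `avg_unbiased` sits near the true value while `avg_biased` systematically undershoots it.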