Monte Carlo Integration

Monte Carlo Integration - May 26, 2009

Today's lecture completes the discussion of the Frequentist and Bayesian schools of thought and introduces Basic Monte Carlo Integration.

Frequentist vs Bayesian Example - Estimating Parameters

Estimating parameters of a univariate Gaussian:

Frequentist: $f(x|\theta)=\frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2}$
Bayesian: $f(\theta|x)=\frac{f(x|\theta)f(\theta)}{f(x)}$

Frequentist Approach

Let $X^N$ denote $(x_1, x_2, ..., x_N)$. Using the Maximum Likelihood Estimation approach to estimate the parameters, we get:

$L(X^N; \theta) = \prod_{i=1}^N \frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{1}{2}(\frac{x_i- \mu} {\sigma})^2}$
$l(X^N; \theta) = \sum_{i=1}^N \left[-\frac{1}{2}\log (2\pi) - \log(\sigma) - \frac{1}{2} \left(\frac{x_i- \mu}{\sigma}\right)^2\right]$
$\frac{dl}{d\mu} = \displaystyle\sum_{i=1}^N(\frac{x_i-\mu}{\sigma^2})$

Setting $\frac{dl}{d\mu} = 0$ we get

$\displaystyle\sum_{i=1}^Nx_i = \displaystyle\sum_{i=1}^N\mu$
$\displaystyle\sum_{i=1}^Nx_i = N\mu \;\Rightarrow\; \hat{\mu} = \frac{1}{N}\displaystyle\sum_{i=1}^Nx_i = \bar{x}$
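Since the log-likelihood is a concave quadratic in $\mu$, the sample mean really is the maximizer. This can be checked numerically; the sketch below is stdlib Python, and the true parameters, seed, and sample size are arbitrary choices for illustration:

```python
import random
import math

random.seed(3)
# hypothetical data: 1000 draws from N(5, 2^2); sigma is treated as known
xs = [random.gauss(5.0, 2.0) for _ in range(1000)]
sigma = 2.0

def log_lik(mu):
    # l(mu) = sum_i [ -0.5*log(2*pi) - log(sigma) - 0.5*((x_i - mu)/sigma)^2 ]
    return sum(-0.5 * math.log(2 * math.pi) - math.log(sigma)
               - 0.5 * ((x - mu) / sigma) ** 2 for x in xs)

mu_hat = sum(xs) / len(xs)  # the MLE derived above: the sample mean
print(log_lik(mu_hat) > log_lik(mu_hat + 0.1))  # True
print(log_lik(mu_hat) > log_lik(mu_hat - 0.1))  # True
```

Perturbing $\mu$ away from $\bar{x}$ in either direction strictly decreases the log-likelihood, as the derivation predicts.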
Bayesian Approach

Assuming the prior is Gaussian:

$f(\theta) = \frac{1}{\sqrt{2\pi}\tau}e^{-\frac{1}{2}\left(\frac{\theta-\mu_0}{\tau}\right)^2}$
$f(\theta|x) \propto \prod_{i=1}^N\frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{1}{2}\left(\frac{x_i-\theta}{\sigma}\right)^2} \cdot \frac{1}{\sqrt{2\pi}\tau}e^{-\frac{1}{2}\left(\frac{\theta-\mu_0}{\tau}\right)^2}$

By completing the square we conclude that the posterior is Gaussian as well:

$f(\theta|x)=\frac{1}{\sqrt{2\pi}\tilde{\sigma}}e^{-\frac{1}{2}\left(\frac{\theta-\tilde{\mu}}{\tilde{\sigma}}\right)^2}$

Where

$\tilde{\mu} = \frac{\frac{N}{\sigma^2}}{{\frac{N}{\sigma^2}}+\frac{1}{\tau^2}}\bar{x} + \frac{\frac{1}{\tau^2}}{{\frac{N}{\sigma^2}}+\frac{1}{\tau^2}}\mu_0 \quad\text{and}\quad \tilde{\sigma}^2 = \left(\frac{N}{\sigma^2}+\frac{1}{\tau^2}\right)^{-1}$

The posterior mean differs from the MLE: it is a precision-weighted average of the sample mean and the prior mean. Note that $\displaystyle\lim_{N\to\infty}\tilde{\mu} = \bar{x}$, so with enough data the prior is overwhelmed. Also note that when $N = 0$ (no data) we get $\tilde{\mu} = \mu_0$, the prior mean.
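The shrinkage behavior of $\tilde{\mu}$ can be checked numerically. The sketch below (stdlib Python; the parameter values, seed, and sample size are arbitrary choices) implements the posterior-mean formula and confirms the two limiting cases:

```python
import random

def posterior_mean(xs, sigma, mu0, tau):
    # tilde_mu = (N/sigma^2 * xbar + 1/tau^2 * mu0) / (N/sigma^2 + 1/tau^2)
    N = len(xs)
    if N == 0:
        return mu0          # no data: the posterior mean is the prior mean
    xbar = sum(xs) / N
    a, b = N / sigma ** 2, 1 / tau ** 2
    return (a * xbar + b * mu0) / (a + b)

random.seed(0)
sigma, mu0, tau = 1.0, 0.0, 1.0          # assumed parameter values
xs = [random.gauss(2.0, sigma) for _ in range(10000)]
xbar = sum(xs) / len(xs)
print(posterior_mean([], sigma, mu0, tau))                      # 0.0 (= mu0)
print(abs(posterior_mean(xs, sigma, mu0, tau) - xbar) < 0.01)   # True: large N
```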

Basic Monte Carlo Integration

$I = \displaystyle\int_a^b h(x)\,dx$
$= \displaystyle\int_a^b w(x)f(x)\,dx$

where

$w(x) = h(x)(b-a)$
$f(x) = \frac{1}{b-a} \rightarrow$ the p.d.f. is Unif$(a,b)$
$I = E_f[w(x)]$, which we estimate by $\hat{I} = \frac{1}{N}\displaystyle\sum_{i=1}^Nw(x_i)$

If $x_i \sim \text{Unif}(a,b)$, then by the Law of Large Numbers $\frac{1}{N}\displaystyle\sum_{i=1}^Nw(x_i) \rightarrow \displaystyle\int w(x)f(x)\,dx = E_f[w(x)] = I$ as $N \to \infty$.

Process

Given $I = \displaystyle\int^b_ah(x)\,dx$

1. $w(x) = h(x)(b-a)$
2. $x_1, x_2, ..., x_N \sim \text{Unif}(a,b)$
3. $\hat{I} = \frac{1}{N}\displaystyle\sum_{i=1}^Nw(x_i)$

From this we can compute other statistics, such as

1. $SE=\frac{s}{\sqrt{N}}$ where $s^2=\frac{\sum_{i=1}^{N}(Y_i-\hat{I})^2 }{N-1}$ with $Y_i=w(x_i)$
2. $1-\alpha$ confidence intervals can be estimated as $\hat{I}\pm z_{\frac{\alpha}{2}}\cdot SE$
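The three-step process and the SE/CI formulas above can be combined into a few lines of code. The sketch below is stdlib Python; $h(x) = x^2$ on $[0,1]$ is just an illustrative integrand with known answer $1/3$, and the seed and sample size are arbitrary:

```python
import random
import math

def mc_integrate(h, a, b, N):
    """Estimate I = integral of h over [a, b] with a 95% confidence interval."""
    # Steps 1-2: draw x_i ~ Unif(a, b) and form w(x_i) = h(x_i)(b - a)
    ys = [h(random.uniform(a, b)) * (b - a) for _ in range(N)]
    # Step 3: the estimate is the sample mean of the w values
    I_hat = sum(ys) / N
    # SE = s / sqrt(N), where s^2 is the sample variance of the Y_i = w(x_i)
    s2 = sum((y - I_hat) ** 2 for y in ys) / (N - 1)
    se = math.sqrt(s2 / N)
    z = 1.96  # z_{alpha/2} for a 95% interval
    return I_hat, se, (I_hat - z * se, I_hat + z * se)

random.seed(1)
I_hat, se, ci = mc_integrate(lambda x: x ** 2, 0.0, 1.0, 100000)
print(I_hat, se, ci)  # I_hat should be close to 1/3
```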

Example 1

Find $E[\sqrt{X}]$ where $X$ has density $f(x) = e^{-x},\ x > 0$ (i.e., $X \sim \text{Exp}(1)$)

1. We need to draw from $f(x) = e^{-x}$; this can be done by inverse-CDF sampling, $x = -\log(u)$ with $u \sim \text{Unif}(0,1)$
2. $\hat{I} = \frac{1}{N}\displaystyle\sum_{i=1}^N\sqrt{x_i}$

This example can be illustrated in Matlab using the code below:

u = rand(100,1);
x = -log(u);                  % draw from Exp(1) by the inverse CDF method
h = sqrt(x);                  % w(x) = sqrt(x)
mean(h)                       % the value obtained using the Monte Carlo method
F = @(x) sqrt(x).*exp(-x);
quadgk(F, 0, Inf)             % the true value of the integral, computed numerically
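For comparison, here is the same estimate as a stdlib-Python sketch (the seed and sample size are arbitrary choices); the exact answer is $E[\sqrt{X}] = \Gamma(3/2) = \sqrt{\pi}/2 \approx 0.8862$:

```python
import random
import math

random.seed(4)
N = 200000
# x ~ Exp(1) via the inverse CDF: x = -log(u), u ~ Unif(0, 1)
xs = [-math.log(1.0 - random.random()) for _ in range(N)]
I_hat = sum(math.sqrt(x) for x in xs) / N
print(I_hat)  # should be close to sqrt(pi)/2 ~ 0.8862
```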


Example 2 Find $I = \displaystyle\int^1_0h(x)\,dx$ where $h(x) = x^3$

1. Analytically, $I = \left[\frac{x^4}{4}\right]_0^1 = \frac{1}{4}$
2. $w(x) = x^3(1-0) = x^3$
3. $x_i \sim \text{Unif}(0,1)$
4. $\hat{I} = \frac{1}{N}\displaystyle\sum_{i=1}^Nx_i^3$

This example can be illustrated in Matlab using the code below:

x = rand(1000,1);
mean(x.^3)                    % should be close to the true value 0.25


Example 3 To estimate an improper integral such as $I = \displaystyle\int^\infty_5 h(x)\,dx$ where $h(x) = 3e^{-x}$, first transform the domain to a finite interval:

1. Substitute $y=\frac{1}{x-5+1}=\frac{1}{x-4}$, so $x = \frac{1}{y}+4$ and $dy=-\frac{1}{(x-4)^2}\,dx=-y^2\,dx$; the minus sign is absorbed by reversing the limits of integration ($x: 5 \to \infty$ corresponds to $y: 1 \to 0$)
2. $I = \displaystyle\int^1_0 \frac{3e^{-(\frac{1}{y}+4)}}{y^2}\,dy$
3. $w(y) = \frac{3e^{-(\frac{1}{y}+4)}}{y^2}(1-0)$
4. $y_i \sim \text{Unif}(0,1)$
5. $\hat{I} = \frac{1}{N}\displaystyle\sum_{i=1}^N\frac{3e^{-(\frac{1}{y_i}+4)}}{y_i^2}$
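As a check, the exact value here is $\int_5^\infty 3e^{-x}\,dx = 3e^{-5} \approx 0.0202$, and the transformed estimator can be sketched in stdlib Python (seed and sample size are arbitrary choices):

```python
import random
import math

random.seed(2)
N = 100000
total = 0.0
for _ in range(N):
    y = 1.0 - random.random()   # Unif on (0, 1]; avoids division by zero at y = 0
    total += 3.0 * math.exp(-(1.0 / y + 4.0)) / y ** 2   # w(y_i)
I_hat = total / N
print(I_hat)  # should be close to 3*exp(-5) ~ 0.0202
```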