Monte Carlo Integration


Monte Carlo Integration - May 26, 2009

Today's lecture completes the discussion of the Frequentist and Bayesian schools of thought and introduces basic Monte Carlo integration.

Frequentist vs Bayesian Example - Estimating Parameters

Estimating parameters of a univariate Gaussian:

Frequentist: [math]f(x|\theta)=\frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{1}{2}(\frac{x-\mu}{\sigma})^2}[/math]
Bayesian: [math]f(\theta|x)=\frac{f(x|\theta)f(\theta)}{f(x)}[/math]

Frequentist Approach

Let [math]X^N[/math] denote [math](x_1, x_2, \ldots, x_N)[/math]. Using the maximum likelihood estimation approach to estimate the parameters, we get:

[math]L(X^N; \theta) = \prod_{i=1}^N \frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{1}{2}(\frac{x_i- \mu} {\sigma})^2}[/math]
[math]l(X^N; \theta) = \sum_{i=1}^N \left[-\frac{1}{2}\log(2\pi) - \log(\sigma) - \frac{1}{2} \left(\frac{x_i- \mu}{\sigma}\right)^2\right] [/math]
[math]\frac{dl}{d\mu} = \displaystyle\sum_{i=1}^N(\frac{x_i-\mu}{\sigma^2})[/math]

Setting [math]\frac{dl}{d\mu} = 0[/math] we get

[math]\displaystyle\sum_{i=1}^Nx_i = \displaystyle\sum_{i=1}^N\mu[/math]
[math]\displaystyle\sum_{i=1}^Nx_i = N\mu \rightarrow \mu = \frac{1}{N}\displaystyle\sum_{i=1}^Nx_i[/math]
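As a quick numerical check (a sketch, not part of the original lecture; the parameter values are illustrative assumptions), the MLE [math]\bar{x}[/math] can be compared against the true mean in Matlab:

mu_true = 2; sigma_true = 3; N = 10000;   % assumed values, for illustration only
x = mu_true + sigma_true*randn(N,1);      % x_i ~ N(mu_true, sigma_true^2)
mu_hat = mean(x)                          % MLE of mu; close to mu_true for large N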
Bayesian Approach

Assuming the prior on the unknown mean [math]\theta[/math] is Gaussian (with [math]\sigma[/math] treated as known):

[math]P(\theta) = \frac{1}{\sqrt{2\pi}\tau}e^{-\frac{1}{2}(\frac{\theta-\mu_0}{\tau})^2}[/math]
[math]f(\theta|x) \propto \prod_{i=1}^N\frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{1}{2}(\frac{x_i-\theta}{\sigma})^2} \cdot \frac{1}{\sqrt{2\pi}\tau}e^{-\frac{1}{2}(\frac{\theta-\mu_0}{\tau})^2}[/math]

By completing the square we conclude that the posterior is Gaussian as well:

[math]f(\theta|x)=\frac{1}{\sqrt{2\pi}\tilde{\sigma}}e^{-\frac{1}{2}(\frac{\theta-\tilde{\mu}}{\tilde{\sigma}})^2}[/math]

Where

[math]\tilde{\mu} = \frac{\frac{N}{\sigma^2}}{{\frac{N}{\sigma^2}}+\frac{1}{\tau^2}}\bar{x} + \frac{\frac{1}{\tau^2}}{{\frac{N}{\sigma^2}}+\frac{1}{\tau^2}}\mu_0[/math]

The posterior mean differs from the MLE. Note that [math]\displaystyle\lim_{N\to\infty}\tilde{\mu} = \bar{x}[/math], so for large samples the data dominate the prior. Also note that when [math]N = 0[/math] we get [math]\tilde{\mu} = \mu_0[/math], the prior mean.
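The shrinkage behaviour of [math]\tilde{\mu}[/math] can be illustrated with a minimal Matlab sketch (the prior settings, data standard deviation, and sample size below are assumptions, not from the lecture):

mu0 = 0; tau = 1;            % prior mean and prior standard deviation (assumed)
sigma = 2; N = 50;           % known data standard deviation and sample size (assumed)
x = 3 + sigma*randn(N,1);    % data generated around 3, for illustration
xbar = mean(x);
w = (N/sigma^2)/(N/sigma^2 + 1/tau^2);    % weight on the sample mean
mu_tilde = w*xbar + (1-w)*mu0             % between xbar (large N) and mu0 (N = 0)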

Basic Monte Carlo Integration

We start with a simple example:

[math]I = \displaystyle\int_a^b h(x)\,dx[/math]
[math] = \displaystyle\int_a^b w(x)f(x)\,dx[/math]

where

[math]w(x) = h(x)(b-a)[/math]
[math]f(x) = \frac{1}{b-a}[/math], the p.d.f. of [math]Unif(a,b)[/math]
[math]I = E_f[w(X)][/math], which we estimate by [math]\hat{I} = \frac{1}{N}\displaystyle\sum_{i=1}^Nw(x_i)[/math]

If [math]x_i \sim Unif(a,b)[/math], then by the Law of Large Numbers, as [math]N\to\infty[/math], [math]\frac{1}{N}\displaystyle\sum_{i=1}^Nw(x_i) \rightarrow \displaystyle\int w(x)f(x)\,dx = E_f[w(X)][/math]

Process

Given [math]I = \displaystyle\int^b_ah(x)\,dx[/math]

  1. [math]\begin{matrix} w(x) = h(x)(b-a)\end{matrix}[/math]
  2. [math] \begin{matrix} x_1, x_2, \ldots, x_N \sim Unif(a,b)\end{matrix}[/math]
  3. [math]\hat{I} = \frac{1}{N}\displaystyle\sum_{i=1}^Nw(x_i)[/math]

From this we can compute other statistics, such as

  1. [math] SE=\frac{s}{\sqrt{N}}[/math] where [math]s^2=\frac{\sum_{i=1}^{N}(Y_i-\hat{I})^2 }{N-1} [/math] with [math] \begin{matrix}Y_i=w(x_i)\end{matrix}[/math]
  2. [math]\begin{matrix} 1-\alpha \end{matrix}[/math] confidence intervals, which can be estimated as [math] \hat{I}\pm z_{\frac{\alpha}{2}}\cdot SE[/math] (see the sketch after this list)
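The whole process, including the standard error and an approximate 95% confidence interval, can be sketched in Matlab as follows (the integrand [math]h(x)=e^x[/math] on [math][0,1][/math] and the sample size are illustrative choices, not from the lecture):

a = 0; b = 1; N = 10000;
h = @(x) exp(x);                  % example integrand; exact answer is e - 1
x = a + (b-a)*rand(N,1);          % step 2: x_i ~ Unif(a,b)
Y = (b-a)*h(x);                   % step 1: Y_i = w(x_i) = h(x_i)*(b-a)
I_hat = mean(Y)                   % step 3: Monte Carlo estimate
s = sqrt(sum((Y - I_hat).^2)/(N-1));
SE = s/sqrt(N)                    % standard error
CI = [I_hat - 1.96*SE, I_hat + 1.96*SE]   % approximate 95% confidence interval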

Example 1

Find [math] E[\sqrt{X}][/math] for [math]X[/math] with density [math]\begin{matrix} f(x) = e^{-x}\end{matrix} [/math], so that [math]w(x) = \sqrt{x}[/math]

  1. Draw [math]x_1, \ldots, x_N[/math] from [math]\begin{matrix} f(x) = e^{-x}\end{matrix} [/math], e.g. by inversion: [math]x_i = -\log(u_i)[/math] with [math]u_i \sim Unif(0,1)[/math]
  2. [math]\hat{I} = \frac{1}{N}\displaystyle\sum_{i=1}^N\sqrt{x_i}[/math]

This example can be illustrated in Matlab using the code below:

u = rand(100,1);          % u_i ~ Unif(0,1)
x = -log(u);              % inverse-CDF sampling: x_i has density f(x) = exp(-x)
h = sqrt(x);              % w(x_i) = sqrt(x_i)
mean(h)                   % the value obtained using the Monte Carlo method
F = @(x) sqrt(x).*exp(-x);
quad(F,0,50)              % reference value from numerical quadrature (sqrt(pi)/2 ~ 0.8862)

Example 2 Find [math] I = \displaystyle\int^1_0h(x)\,dx[/math] where [math] h(x) = x^3 [/math]

  1. Exact value: [math] I = \left[\frac{x^4}{4}\right]^1_0 = \frac{1}{4} [/math]
  2. [math] w(x) = x^3(1-0) = x^3[/math]
  3. [math] x_i \sim Unif(0,1)[/math]
  4. [math]\hat{I} = \frac{1}{N}\displaystyle\sum_{i=1}^Nx_i^3[/math]

This example can be illustrated in Matlab using the code below:

x = rand(1000,1);     % x_i ~ Unif(0,1)
mean(x.^3)            % Monte Carlo estimate of I (exact value 1/4)

Example 3 To estimate an integral over an infinite range, such as [math] I = \displaystyle\int^\infty_5 h(x)\,dx, h(x) = 3e^{-x} [/math], first map it onto a finite interval with a change of variables (a Matlab sketch follows the steps below).

  1. Substitute [math] y=\frac{1}{x-5+1}=\frac{1}{x-4} [/math], so that [math] x=\frac{1}{y}+4 [/math] and [math] dy=-\frac{1}{(x-4)^2}dx=-y^2dx [/math], i.e. [math] dx=-\frac{dy}{y^2} [/math]; the limits [math] x=5 [/math] and [math] x\to\infty [/math] map to [math] y=1 [/math] and [math] y\to 0 [/math]
  2. [math] I = \displaystyle\int^1_0 \frac{3e^{-(\frac{1}{y}+4)}}{y^2}\,dy [/math]
  3. [math] w(y) = \frac{3e^{-(\frac{1}{y}+4)}}{y^2}(1-0)[/math]
  4. [math] y_i \sim Unif(0,1)[/math]
  5. [math]\hat{I} = \frac{1}{N}\displaystyle\sum_{i=1}^N\frac{3e^{-(\frac{1}{y_i}+4)}}{y_i^2}[/math]
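This example can be sketched in Matlab as below (the sample size is an illustrative choice); the exact value is [math]3e^{-5} \approx 0.0202[/math]:

N = 100000;
y = rand(N,1);                       % y_i ~ Unif(0,1)
w = 3*exp(-(1./y + 4))./(y.^2);      % w(y_i) after the substitution y = 1/(x-4)
I_hat = mean(w)                      % Monte Carlo estimate of I, close to 3*exp(-5)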