Monte Carlo Integration

Monte Carlo Integration - May 26, 2009

Today's lecture completes the discussion of the Frequentist and Bayesian schools of thought and introduces basic Monte Carlo integration.

Frequentist vs Bayesian Example - Estimating Parameters

Estimating parameters of a univariate Gaussian:

Frequentist: [math]\displaystyle{ f(x|\theta)=\frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^2} }[/math]
Bayesian: [math]\displaystyle{ f(\theta|x)=\frac{f(x|\theta)f(\theta)}{f(x)} }[/math]

Frequentist Approach

Let [math]\displaystyle{ X^N }[/math] denote [math]\displaystyle{ (x_1, x_2, \ldots, x_N) }[/math]. Using maximum likelihood estimation to estimate the parameters, we get:

[math]\displaystyle{ L(X^N; \theta) = \prod_{i=1}^N \frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{1}{2}(\frac{x_i- \mu} {\sigma})^2} }[/math]
[math]\displaystyle{ l(X^N; \theta) = \sum_{i=1}^N \left[-\frac{1}{2}\log(2\pi) - \log\sigma - \frac{1}{2} \left(\frac{x_i- \mu}{\sigma}\right)^2\right] }[/math]
[math]\displaystyle{ \frac{dl}{d\mu} = \displaystyle\sum_{i=1}^N(\frac{x_i-\mu}{\sigma^2}) }[/math]

Setting [math]\displaystyle{ \frac{dl}{d\mu} = 0 }[/math] we get

[math]\displaystyle{ \displaystyle\sum_{i=1}^Nx_i = \displaystyle\sum_{i=1}^N\mu }[/math]
[math]\displaystyle{ \displaystyle\sum_{i=1}^Nx_i = N\mu \;\Rightarrow\; \hat{\mu} = \frac{1}{N}\displaystyle\sum_{i=1}^Nx_i = \bar{x} }[/math]
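
As a quick numerical sanity check (a minimal Matlab sketch of our own, with assumed parameter values, not from the lecture), the sample mean recovers the true mean:

mu = 2; sigma = 3; N = 10000;   % assumed true parameters, for illustration only
x = mu + sigma*randn(N,1);      % draw x_1, ..., x_N from N(mu, sigma^2)
mean(x)                         % the MLE of mu; should be close to 2
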
Bayesian Approach

Assuming the prior is Gaussian:

[math]\displaystyle{ P(\theta) = \frac{1}{\sqrt{2\pi}\tau}e^{-\frac{1}{2}\left(\frac{\theta-\mu_0}{\tau}\right)^2} }[/math]
[math]\displaystyle{ f(\theta|x) \propto \prod_{i=1}^N \frac{1}{\sqrt{2\pi}\sigma}e^{-\frac{1}{2}\left(\frac{x_i-\theta}{\sigma}\right)^2} \cdot \frac{1}{\sqrt{2\pi}\tau}e^{-\frac{1}{2}\left(\frac{\theta-\mu_0}{\tau}\right)^2} }[/math]

By completing the square we conclude that the posterior is Gaussian as well:

[math]\displaystyle{ f(\theta|x)=\frac{1}{\sqrt{2\pi}\tilde{\sigma}}e^{-\frac{1}{2}\left(\frac{\theta-\tilde{\mu}}{\tilde{\sigma}}\right)^2} }[/math]

where

[math]\displaystyle{ \tilde{\mu} = \frac{\frac{N}{\sigma^2}}{{\frac{N}{\sigma^2}}+\frac{1}{\tau^2}}\bar{x} + \frac{\frac{1}{\tau^2}}{{\frac{N}{\sigma^2}}+\frac{1}{\tau^2}}\mu_0 }[/math]
[math]\displaystyle{ \frac{1}{\tilde{\sigma}^2} = \frac{N}{\sigma^2} + \frac{1}{\tau^2} }[/math]

The posterior mean is a precision-weighted average of the sample mean and the prior mean, so it differs from the MLE. Note that [math]\displaystyle{ \displaystyle\lim_{N\to\infty}\tilde{\mu} = \bar{x} }[/math] (the data dominate), and when [math]\displaystyle{ N = 0 }[/math] we get [math]\displaystyle{ \tilde{\mu} = \mu_0 }[/math] (the prior dominates).
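
A minimal Matlab sketch (with assumed prior and data values of our own, not from the lecture) of the posterior mean as a precision-weighted average:

mu0 = 0; tau = 1;               % prior: theta ~ N(mu0, tau^2)
sigma = 2; N = 50;              % known noise sd and sample size
x = 1.5 + sigma*randn(N,1);     % data generated with true mean 1.5
w = (N/sigma^2)/(N/sigma^2 + 1/tau^2);
mu_tilde = w*mean(x) + (1-w)*mu0  % shrinks the sample mean toward mu0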

Basic Monte Carlo Integration

We start with a simple example:

[math]\displaystyle{ I = \displaystyle\int_a^b h(x)\,dx }[/math]
[math]\displaystyle{ = \displaystyle\int_a^b w(x)f(x)\,dx }[/math]

where

[math]\displaystyle{ w(x) = h(x)(b-a) }[/math]
[math]\displaystyle{ f(x) = \frac{1}{b-a} \rightarrow }[/math] the p.d.f. is Unif[math]\displaystyle{ (a,b) }[/math]
[math]\displaystyle{ I = E_f[w(X)], \qquad \hat{I} = \frac{1}{N}\displaystyle\sum_{i=1}^Nw(x_i) }[/math]

If [math]\displaystyle{ x_i \sim \mathrm{Unif}(a,b) }[/math], then by the Law of Large Numbers [math]\displaystyle{ \frac{1}{N}\displaystyle\sum_{i=1}^Nw(x_i) \rightarrow \displaystyle\int w(x)f(x)\,dx = E_f[w(X)] }[/math]

Process

Given [math]\displaystyle{ I = \displaystyle\int^b_ah(x)\,dx }[/math]

  1. [math]\displaystyle{ \begin{matrix} w(x) = h(x)(b-a)\end{matrix} }[/math]
  2. [math]\displaystyle{ \begin{matrix} x_1, x_2, \ldots, x_N \sim \mathrm{Unif}(a,b)\end{matrix} }[/math]
  3. [math]\displaystyle{ \hat{I} = \frac{1}{N}\displaystyle\sum_{i=1}^Nw(x_i) }[/math]

From this we can compute other statistics, such as

  1. [math]\displaystyle{ SE=\frac{s}{\sqrt{N}} }[/math] where [math]\displaystyle{ s^2=\frac{\sum_{i=1}^{N}(Y_i-\hat{I})^2 }{N-1} }[/math] with [math]\displaystyle{ \begin{matrix}Y_i=w(x_i)\end{matrix} }[/math]
  2. A [math]\displaystyle{ \begin{matrix} 1-\alpha \end{matrix} }[/math] confidence interval can be estimated as [math]\displaystyle{ \hat{I}\pm Z_{\frac{\alpha}{2}}\cdot SE }[/math]
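
Putting the three steps and these statistics together, a minimal Matlab sketch (our example, not from the lecture, using h(x) = sin(x) on [0, pi], whose exact integral is 2):

a = 0; b = pi; N = 10000;
x = a + (b-a)*rand(N,1);            % x_i ~ Unif(a,b)
Y = (b-a)*sin(x);                   % Y_i = w(x_i) = h(x_i)(b-a)
I_hat = mean(Y)                     % Monte Carlo estimate; exact value is 2
SE = std(Y)/sqrt(N);                % std uses the N-1 denominator, matching s above
[I_hat - 1.96*SE, I_hat + 1.96*SE]  % approximate 95% CI (Z_{alpha/2} = 1.96)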

Example 1

Find [math]\displaystyle{ E[\sqrt{x}] }[/math] where [math]\displaystyle{ x }[/math] has density [math]\displaystyle{ \begin{matrix} f(x) = e^{-x}\end{matrix} }[/math], [math]\displaystyle{ x \geq 0 }[/math] (the Exp(1) distribution).

  1. Draw [math]\displaystyle{ x_1, \ldots, x_N }[/math] from [math]\displaystyle{ \begin{matrix} f(x) = e^{-x}\end{matrix} }[/math] (e.g. by inverse transform: if [math]\displaystyle{ u \sim \mathrm{Unif}(0,1) }[/math], then [math]\displaystyle{ -\log(u) \sim \mathrm{Exp}(1) }[/math])
  2. [math]\displaystyle{ \hat{I} = \frac{1}{N}\displaystyle\sum_{i=1}^N\sqrt{x_i} }[/math], since here [math]\displaystyle{ w(x) = \sqrt{x} }[/math]

This example can be illustrated in Matlab using the code below:

u = rand(100,1);              % u_i ~ Unif(0,1)
x = -log(u);                  % inverse-transform sampling: x_i ~ Exp(1)
h = sqrt(x);                  % w(x_i) = sqrt(x_i)
mean(h)                       % the value obtained using the Monte Carlo method
F = @(x) sqrt(x).*exp(-x);
quad(F,0,50)                  % reference value by numerical quadrature (sqrt(pi)/2, approx. 0.8862)

Example 2

Find [math]\displaystyle{ I = \displaystyle\int^1_0h(x)\,dx, \quad h(x) = x^3 }[/math]

  1. Analytically, [math]\displaystyle{ I = \left[\frac{x^4}{4}\right]_0^1 = \frac{1}{4} }[/math]
  2. [math]\displaystyle{ w(x) = x^3(1-0) = x^3 }[/math]
  3. [math]\displaystyle{ x_i \sim \mathrm{Unif}(0,1) }[/math]
  4. [math]\displaystyle{ \hat{I} = \frac{1}{N}\displaystyle\sum_{i=1}^Nx_i^3 }[/math]

This example can be illustrated in Matlab using the code below:

x = rand(1000,1);             % x_i ~ Unif(0,1)
mean(x.^3)                    % Monte Carlo estimate; exact value is 0.25

Example 3

To estimate an integral over an infinite range, such as [math]\displaystyle{ I = \displaystyle\int^\infty_5 h(x)\,dx, \quad h(x) = 3e^{-x} }[/math], first substitute to map it onto a finite interval:

  1. Substitute [math]\displaystyle{ y=\frac{1}{x-5+1}=\frac{1}{x-4} \Rightarrow dy=-\frac{1}{(x-4)^2}\,dx=-y^2\,dx }[/math]; the limits [math]\displaystyle{ x=5 }[/math] and [math]\displaystyle{ x=\infty }[/math] map to [math]\displaystyle{ y=1 }[/math] and [math]\displaystyle{ y=0 }[/math]
  2. [math]\displaystyle{ I = \displaystyle\int^1_0 \frac{3e^{-(\frac{1}{y}+4)}}{y^2}\,dy }[/math]
  3. [math]\displaystyle{ w(y) = \frac{3e^{-(\frac{1}{y}+4)}}{y^2}(1-0) }[/math] (the minus sign from [math]\displaystyle{ dx=-\frac{dy}{y^2} }[/math] is absorbed by flipping the limits of integration)
  4. [math]\displaystyle{ y_i \sim \mathrm{Unif}(0,1) }[/math]
  5. [math]\displaystyle{ \hat{I} = \frac{1}{N}\displaystyle\sum_{i=1}^N(\frac{3e^{-(\frac{1}{y_i}+4)}}{y_i^2}) }[/math]
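
As in the previous examples, this can be illustrated in Matlab; the code below is our sketch in the same style, not from the original notes:

y = rand(10000,1);              % y_i ~ Unif(0,1)
w = 3*exp(-(1./y + 4))./y.^2;   % w(y_i)
mean(w)                         % the Monte Carlo estimate
3*exp(-5)                       % exact value of the integral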

Importance Sampling and Monte Carlo Simulation - May 28, 2009