stat341 / CM 361

Computational Statistics and Data Analysis is a course offered at the University of Waterloo, Spring 2009.

Instructor: Ali Ghodsi


==Sampling (Generating Random numbers)==

===Inverse Transform Method===

Step 1: Draw <math> U \sim \mathrm{Unif}[0,1] </math>. <br />
Step 2: Compute <math> X = F^{-1}(U) </math>.<br />
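
For illustration, a minimal Python sketch of these two steps could look as follows; the function name <code>inverse_transform_sample</code> and its arguments are only placeholders, not a fixed convention:
<pre>
import random

def inverse_transform_sample(inv_cdf, n):
    """Draw n samples using the inverse transform method.

    inv_cdf: a function that evaluates F^{-1}(u) for u in [0, 1).
    """
    samples = []
    for _ in range(n):
        u = random.random()         # Step 1: u ~ Unif[0, 1)
        samples.append(inv_cdf(u))  # Step 2: x = F^{-1}(u)
    return samples
</pre>
Any distribution whose inverse CDF we can write down can be passed in as <code>inv_cdf</code>, as the exponential example below shows.
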
'''Example:'''<br />
Suppose we want to draw a sample from <math> f(x) = \lambda e^{-\lambda x} </math> where <math>x>0</math>. <br />
We need to first find <math>F(x)</math> and then <math>F^{-1}</math>.<br />
<math> F(x) = \int^x_0 \lambda e^{-\lambda u}\, du = 1 - e^{-\lambda x} </math> <br />
Setting <math> y = F(x) = 1 - e^{-\lambda x} </math> and solving for <math>x</math> gives<br />
<math> F^{-1}(y) = \frac{-\log(1-y)}{\lambda} </math> <br />
Now we can generate our random sample <math>x_i,\ i=1,\dots,n</math>, from <math>f(x)</math> by:<br />
<math>1)\ u_i \sim \mathrm{Unif}(0,1) </math><br />
<math>2)\ x_i = \frac{-\log(1-u_i)}{\lambda} </math><br />
The <math>x_i</math> are now a random sample from <math>f(x)</math>. <br />
The major problem with this approach is that we have to find
<math>F^{-1}</math> and for many distributions it is too difficult to find the inverse of
<math>F(x)</math>.
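
For the exponential example, a short Python sketch is given below; the value of <math>\lambda</math>, the sample size, and the check against the theoretical mean <math>1/\lambda</math> are illustrative assumptions:
<pre>
import math
import random

lam = 2.0    # rate parameter lambda (illustrative value)
n = 10000    # number of samples to draw

# Step 1: u_i ~ Unif(0,1);  Step 2: x_i = -log(1 - u_i) / lambda
x = [-math.log(1.0 - random.random()) / lam for _ in range(n)]

# Sanity check: the sample mean should be close to the true mean 1/lambda
print(sum(x) / n, 1.0 / lam)
</pre>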
