Markov Chain Definitions

**Rough Work**

Markov chains can be used to determine the probability of being in a given state after a certain number of transition steps. In a homogeneous Markov chain, the transition matrix does not change over time.
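
As a minimal sketch of this idea (the two-state matrix below is an illustrative assumption, not from the notes), the distribution over states after $n$ steps is the initial distribution multiplied by the $n$-th power of the transition matrix:

```python
import numpy as np

# Hypothetical two-state homogeneous Markov chain; P[i, j] is the
# probability of moving from state i to state j in one step.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Initial distribution: start in state 0 with probability 1.
pi0 = np.array([1.0, 0.0])

# Because the chain is homogeneous, the distribution after n steps is
# pi_n = pi_0 P^n -- the same matrix P is applied at every step.
n = 10
pi_n = pi0 @ np.linalg.matrix_power(P, n)
print(pi_n)  # probabilities of being in each state after n steps
```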

For an irreducible Markov chain, the transition matrix always has an eigenvalue of 1 (every stochastic matrix does, since its rows sum to 1), and irreducibility guarantees that the stationary distribution associated with that eigenvalue is unique.
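
A minimal sketch of extracting that stationary distribution, reusing the illustrative two-state matrix from above (an assumption, not part of the notes): the left eigenvector of $P$ for eigenvalue 1, normalized to sum to 1, is the stationary distribution.

```python
import numpy as np

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Left eigenvectors of P are the (right) eigenvectors of P transposed.
eigvals, eigvecs = np.linalg.eig(P.T)

# Select the eigenvector whose eigenvalue is numerically closest to 1.
idx = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, idx])

# Normalize so the entries form a probability distribution.
pi = pi / pi.sum()
print(pi)      # stationary distribution
print(pi @ P)  # equals pi: the distribution is unchanged by P
```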

To estimate the integral $I = \int h(x) f(x)\,dx$, we want the Markov chain to approach the distribution $f(x)$, and then compute $\hat{I} = \frac{1}{N}\sum_{i=1}^{N} h(x_i)$, where the $x_i$ are the states generated by the chain.
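
The notes do not specify how the chain is constructed; random-walk Metropolis is one standard way to build a chain whose stationary distribution is $f$. The target $f$, integrand $h$, and proposal scale below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (not from the notes): f is a standard normal
# density up to its normalizing constant, and h(x) = x^2, so the
# integral being estimated is E[X^2] = 1.
f = lambda x: np.exp(-x ** 2 / 2)
h = lambda x: x ** 2

# Random-walk Metropolis: propose a local move, accept with
# probability min(1, f(proposal) / f(x)).
N = 100_000
x = 0.0
samples = np.empty(N)
for i in range(N):
    proposal = x + rng.normal(scale=1.0)
    if rng.random() < f(proposal) / f(x):
        x = proposal
    samples[i] = x

# Monte Carlo estimate: (1/N) * sum of h(x_i) over the chain's states.
print(np.mean(h(samples)))  # approximately 1
```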

Example of a pseudo-random number sequence:
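
The notes do not say which generator is meant; as a hedged sketch, a linear congruential generator is a common textbook example (the constants below are glibc's rand() parameters, an illustrative assumption):

```python
def lcg(seed, a=1103515245, c=12345, m=2 ** 31):
    """Linear congruential generator: x_{k+1} = (a * x_k + c) mod m.

    The constants are the ones used by glibc's rand(); they are an
    illustrative choice, not taken from the notes.
    """
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m  # scale to [0, 1)

gen = lcg(seed=42)
print([round(next(gen), 4) for _ in range(5)])
```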