Markov Chain Definitions: Difference between revisions

From statwiki

Revision as of 11:39, 9 June 2009

**Rough Work**

Markov chains can be used to determine the probability of being in a given state after a certain number of transition steps. In a homogeneous Markov chain the transition matrix does not change over time.
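A minimal sketch of this idea, assuming NumPy (the two-state matrix <math>P</math> and starting distribution are hypothetical examples, not from the notes): after <math>n</math> steps the state distribution of a homogeneous chain is the initial distribution multiplied by <math>P^n</math>.

```python
import numpy as np

# Hypothetical homogeneous Markov chain with two states.
# Each row of P sums to 1, and P does not change over time.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

pi0 = np.array([1.0, 0.0])  # start in state 0 with probability 1

# Probability of each state after n transition steps: pi0 @ P^n
n = 10
pi_n = pi0 @ np.linalg.matrix_power(P, n)
print(pi_n)  # distribution over the two states after 10 steps
```

Because the chain is homogeneous, the same matrix <math>P</math> is applied at every step; for this particular <math>P</math> the distribution settles quickly toward its stationary value.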

In an irreducible Markov chain every state can be reached from every other state. The transition matrix of any Markov chain has at least one eigenvalue equal to 1, and for an irreducible chain the stationary distribution associated with that eigenvalue is unique.
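To illustrate the eigenvalue claim, here is a sketch assuming NumPy (the three-state matrix is a made-up irreducible example): the left eigenvector of the transition matrix for eigenvalue 1, once normalized, is the stationary distribution.

```python
import numpy as np

# A hypothetical small irreducible chain: every state can reach every other.
P = np.array([[0.5,  0.5,  0.0],
              [0.25, 0.5,  0.25],
              [0.0,  0.5,  0.5]])

# Any row-stochastic matrix has 1 as an eigenvalue. For an irreducible
# chain, the left eigenvector for eigenvalue 1 (normalized to sum to 1)
# is the unique stationary distribution.
w, v = np.linalg.eig(P.T)
idx = np.argmin(np.abs(w - 1.0))
pi = np.real(v[:, idx])
pi /= pi.sum()
print(pi)  # stationary distribution of the chain
```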

To estimate the integral <math>I = \int h(x)f(x)\,dx</math>, we wish for the Markov chain to approach the distribution <math>f(x)</math>, and then compute <math>\frac{1}{N}\sum_{i=1}^{N} h(x_i)</math>, where the <math>x_i</math> are the states generated by the chain.
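As a sketch of this estimator, assuming NumPy and using a simple random-walk Metropolis sampler (a standard MCMC method, chosen here for illustration; the target <math>f</math>, the function <math>h</math>, and all tuning constants are assumptions, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)

# Target density f(x), here a standard normal up to a constant,
# and an example function h with known mean E[h(X)] = 1.
def f(x):
    return np.exp(-0.5 * x * x)

def h(x):
    return x * x

# Random-walk Metropolis: a Markov chain whose stationary
# distribution is f(x).
N = 50_000
x = 0.0
samples = np.empty(N)
for i in range(N):
    proposal = x + rng.normal(scale=1.0)
    # Accept the move with probability min(1, f(proposal)/f(x)).
    if rng.random() < f(proposal) / f(x):
        x = proposal
    samples[i] = x

# Monte Carlo estimate of I = integral of h(x) f(x) dx:
I_hat = h(samples).mean()
print(I_hat)  # should be close to 1
```

The chain spends time in each region in proportion to <math>f(x)</math>, so the plain average of <math>h(x_i)</math> over the generated states approximates the integral.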

An example of such a generated sequence is a pseudo-random number sequence.