
Branching Process - Markov Chain

I am studying Markov chains and find the continuous case a bit unintuitive. This problem concerns martingales and branching processes, and it is supposedly not hard. I just need a detailed solution to this problem, so that I can see which concepts are used to solve it.


Solution Preview


A Markov chain in discrete time is irreducible, has state space I = {0, 1, 2, …}, and starts at 1. It is both a branching process and a martingale. Determine the probability of hitting 0.
Let {Xn, n = 0, 1, …} with X0 = 1 be a discrete-time branching process. Here Xn denotes the state of ...
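As a complement to the solution (not part of the posted file): since {Xn} is a martingale, the mean number of offspring per individual is 1, so the branching process is critical, and a critical branching process with a non-degenerate offspring law dies out with probability 1. A quick simulation sketch illustrates this, using a hypothetical offspring law of 0 or 2 children with equal probability (mean 1, so the martingale property holds):

```python
import random

def offspring():
    # Hypothetical offspring law: 0 or 2 children with probability 1/2 each,
    # so the mean is 1 and X_n is a martingale (critical branching process).
    return 0 if random.random() < 0.5 else 2

def extinct_within(n_steps=1000):
    """Run one branching process from X_0 = 1 and report whether
    it has hit 0 (gone extinct) within n_steps generations."""
    x = 1
    for _ in range(n_steps):
        if x == 0:
            return True
        # Next generation: each of the x current individuals reproduces.
        x = sum(offspring() for _ in range(x))
    return x == 0

random.seed(0)
trials = 2000
hits = sum(extinct_within() for _ in range(trials))
print(f"Estimated P(hit 0) = {hits / trials:.3f}")  # prints an estimate close to 1
```

The estimate is slightly below 1 only because the simulation is cut off at a finite horizon; as n_steps grows, the empirical extinction frequency tends to 1, in line with the result that hitting 0 is certain.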

Solution Summary

The solution shows that extinction in such a Markov chain is certain: if the chain is both a branching process and a martingale, the probability of hitting 0 equals 1.