Final answer:
No. For an ergodic Markov chain, the long-term value of the probability distribution vector does not depend on the particular starting value x(0): the chain converges to the same stationary distribution regardless of the initial state.
Step-by-step explanation:
The long-term behavior of a Markov chain is determined by its transition matrix P rather than by its initial state. The limiting probability distribution vector, when it exists, is called the stationary distribution π; it satisfies π = πP and describes the steady-state proportion of time the chain spends in each state.
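As a minimal sketch, here is the stationary distribution of a hypothetical 2-state chain. For P = [[1-a, a], [b, 1-b]], solving π = πP with π summing to 1 gives the closed form π = (b/(a+b), a/(a+b)); the values of a and b below are illustrative assumptions, not from the question.

```python
# Hypothetical 2-state transition matrix (rows sum to 1):
#   P = [[0.9, 0.1],
#        [0.5, 0.5]]
# For P = [[1-a, a], [b, 1-b]], the stationary distribution is
#   pi = (b / (a + b), a / (a + b)).
a, b = 0.1, 0.5
pi = (b / (a + b), a / (a + b))
print(pi)  # (0.8333..., 0.1666...)

# Verify stationarity component-wise: pi P = pi.
pi0 = pi[0] * (1 - a) + pi[1] * b
pi1 = pi[0] * a + pi[1] * (1 - b)
assert abs(pi0 - pi[0]) < 1e-12 and abs(pi1 - pi[1]) < 1e-12
```

The assertion at the end checks that multiplying π by P returns π, which is exactly the defining equation of a stationary distribution.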
Once a Markov chain reaches its stationary distribution, that distribution is independent of the starting state x(0). In other words, wherever the chain starts, x(t) = x(0)Pᵗ converges to the same long-term distribution, provided the chain is ergodic (irreducible, aperiodic, and positive recurrent).
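To illustrate this independence from x(0), the sketch below iterates x(t+1) = x(t)P from two opposite starting vectors for the same hypothetical 2-state chain and shows they land on the same distribution (the matrix P is an assumed example, not given in the question).

```python
def step(x, P):
    # One chain step: x(t+1) = x(t) P (row vector times matrix).
    n = len(P)
    return [sum(x[i] * P[i][j] for i in range(n)) for j in range(n)]

P = [[0.9, 0.1],
     [0.5, 0.5]]

x = [1.0, 0.0]   # x(0): start surely in state 0
y = [0.0, 1.0]   # x(0): start surely in state 1
for _ in range(200):
    x, y = step(x, P), step(y, P)

# Both starting distributions converge to the same stationary vector.
print(x)  # ~[0.8333, 0.1667]
print(y)  # ~[0.8333, 0.1667]
```

After enough steps the two trajectories agree to machine precision, which is the numerical face of "the long-term distribution does not depend on x(0)."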
In the short term, however, the initial state can significantly influence the probabilities of the different states; as time progresses, this influence diminishes and eventually becomes negligible. This reflects the Markov (memoryless) property: the future evolution depends only on the current state, not on the sequence of states that preceded it. (The exponential distribution plays the analogous memoryless role for holding times in continuous-time Markov chains.)