Final answer:
The statement is true: in the transition matrix of a Markov chain, each row holds the transition probabilities out of one state, so each row must sum to one, i.e., each row is a probability vector.
Step-by-step explanation:
The statement regarding the transition matrix of a Markov chain is true because each row must sum to one. Each row represents the probabilities of moving from a given state to every possible state (including remaining in the same state) in the next step of the chain. Since these outcomes are mutually exclusive and collectively exhaustive, their probabilities must add up to exactly 1; this guarantees that the chain transitions to some state with certainty. Each row is therefore a probability vector: a list of nonnegative entries, each between 0 and 1, whose sum is one.
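As a small illustration (the two-state "weather" chain below is a hypothetical example, not part of the original question), the row-sum property is easy to verify numerically:

```python
import numpy as np

# Hypothetical two-state Markov chain (states: sunny, rainy).
# Row i holds the probabilities of moving from state i to each state.
P = np.array([
    [0.9, 0.1],  # from sunny: stay sunny with 0.9, turn rainy with 0.1
    [0.5, 0.5],  # from rainy: turn sunny with 0.5, stay rainy with 0.5
])

# Each row must be a probability vector: entries in [0, 1] summing to 1.
assert np.all(P >= 0) and np.all(P <= 1)
assert np.allclose(P.sum(axis=1), 1.0)
print(P.sum(axis=1))  # both row sums are 1.0
```

If any row summed to something other than 1, the matrix could not describe a Markov chain, because probability would be created or lost at each step.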