The World Series is a best-of-seven tournament: the first team to win four games wins the series. Suppose that two teams, A and B, are playing. In each game, A wins with probability p and B wins with probability 1 - p.

(a) Let the state of the series be represented by the pair (a, b), where a is the number of games won by A, and b is the number of games won by B. Give the transition for the resulting Markov chain; make sure you also describe the space of feasible states.
(b) Using the chain in part (a), calculate the probability the series ends in five games. (Write a system of equations that, when solved, will yield the desired answer; you need not solve the equations.)
(c) Could you have used a smaller state space to answer the question in part (b)?
(d) Calculate the expected length of the series. (As before you only need to write down the required system of equations.)
(e) Now suppose that teams can get momentum. In particular, suppose that the probability A wins the first game is p, while the probability B wins is 1 - p, as before. However, now if A won the last game, then the probability A wins the next game becomes q > p; while if B won the last game, then the probability A wins the next game becomes r < p. Explain how this changes the Markov chain you defined in part (a).

1 Answer


Final answer:

The series is modeled as a Markov chain whose state (a, b) records the two teams' win counts, with absorption once either count reaches four. The probability that the series ends in five games and the expected length of the series can each be expressed as a system of linear equations over this chain. No essentially smaller state space answers part (b) for general p. If the win probability depends on who won the previous game, the state must be augmented to record the last winner.

Step-by-step explanation:

(a) The state of the series is the pair (a, b), where a is the number of games won by Team A and b is the number of games won by Team B. The feasible states are all pairs (a, b) with 0 ≤ a ≤ 4 and 0 ≤ b ≤ 4 except (4, 4), which can never occur because the series stops as soon as either team reaches four wins (equivalently, min(a, b) ≤ 3). The transitions of the resulting Markov chain are as follows:

  1. If the current state is (a, b) with a < 4 and b < 4, then the next state is (a + 1, b) with probability p and (a, b + 1) with probability 1 - p.
  2. If a = 4 or b = 4, the series is over: the state is absorbing, and the chain stays at (a, b) with probability 1.
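The transition rule above can be sketched in a few lines of Python (the function name `transitions` is my own, not part of the question):

```python
def transitions(a, b, p):
    """Return the list of (next_state, probability) pairs from state (a, b),
    following the two cases above."""
    if a == 4 or b == 4:              # case 2: absorbing, series is over
        return [((a, b), 1.0)]
    return [((a + 1, b), p),          # case 1: A wins the next game
            ((a, b + 1), 1 - p)]      # case 1: B wins the next game
```

For example, from the opening state (0, 0) the chain moves to (1, 0) with probability p or to (0, 1) with probability 1 - p.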

(b) The series ends in five games exactly when the chain is first absorbed at step 5, i.e., when it reaches (4, 1) or (1, 4) after five games. Let f_n(a, b) denote the probability that the chain is in state (a, b) after n games. Then f_0(0, 0) = 1, and for each non-absorbing state the forward equations are f_{n+1}(a, b) = p f_n(a - 1, b) + (1 - p) f_n(a, b - 1) (summing only over feasible predecessors). The desired answer is f_5(4, 1) + f_5(1, 4).
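As a sanity check, the forward equations for part (b) can be iterated numerically (a sketch under the chain defined in part (a); the function name is mine):

```python
def prob_ends_in_five(p):
    """Iterate the forward equations f_{n+1} from f_0 for five games and
    read off the probability of the absorbing states (4,1) and (1,4)."""
    f = {(0, 0): 1.0}                       # distribution after 0 games
    for _ in range(5):
        g = {}
        for (a, b), pr in f.items():
            if a == 4 or b == 4:            # already absorbed: stay put
                g[(a, b)] = g.get((a, b), 0.0) + pr
            else:
                g[(a + 1, b)] = g.get((a + 1, b), 0.0) + pr * p
                g[(a, b + 1)] = g.get((a, b + 1), 0.0) + pr * (1 - p)
        f = g
    # (4,1) and (1,4) can only be reached at step 5, so absorption there
    # means the series ended in exactly five games
    return f[(4, 1)] + f[(1, 4)]
```

This agrees with the closed form 4 p^4 (1 - p) + 4 (1 - p)^4 p: the winner takes four of the first five games, including the fifth, so the lone loss falls in one of the first four games.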

(c) No. Since b equals the number of games played minus a, any chain that answers part (b) must carry the same information as the pair (a, b); for a general p, distinct transient states have distinct future behavior, so no states can be merged and no essentially smaller state space suffices.

(d) The expected length of the series is the expected number of steps until the chain is absorbed, i.e., until it reaches a state with a = 4 or b = 4. Let L(a, b) denote the expected number of remaining games from state (a, b). Then L(a, b) = 0 whenever a = 4 or b = 4, and otherwise L(a, b) = 1 + p L(a + 1, b) + (1 - p) L(a, b + 1). Solving this system gives the answer L(0, 0).
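The system of first-step equations for part (d) is triangular (each L(a, b) depends only on states with more total wins), so it can be solved by backward recursion; a sketch, with the function name my own:

```python
from functools import lru_cache

def expected_length(p):
    """Solve L(a,b) = 1 + p*L(a+1,b) + (1-p)*L(a,b+1), with L = 0 at the
    absorbing states a = 4 or b = 4, by memoized recursion."""
    @lru_cache(maxsize=None)
    def L(a, b):
        if a == 4 or b == 4:
            return 0.0
        return 1.0 + p * L(a + 1, b) + (1 - p) * L(a, b + 1)
    return L(0, 0)
```

For evenly matched teams (p = 1/2), the expected length works out to 5.8125 games.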

(e) If the probability of winning the next game depends on the outcome of the previous game, the state (a, b) is no longer sufficient: the chain must also record who won the last game. The new state is (a, b, w), where w is A or B (or blank before the first game). From a state with w = A, Team A wins the next game with probability q; from a state with w = B, with probability r; and from the initial state (0, 0, blank), with probability p. The state space roughly doubles, and the transition probabilities now depend on w.
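The augmented transition rule for the momentum chain can be sketched as follows (a hypothetical sketch; I encode "no previous game" as `None`, and assume q > p > r as in the question):

```python
def momentum_transitions(state, p, q, r):
    """Transition rule for the momentum chain of part (e).
    state = (a, b, last), where last in {None, 'A', 'B'} records
    who won the previous game."""
    a, b, last = state
    if a == 4 or b == 4:
        return [(state, 1.0)]                 # absorbing: series is over
    win_p = {None: p, 'A': q, 'B': r}[last]   # P(A wins the next game)
    return [((a + 1, b, 'A'), win_p),
            ((a, b + 1, 'B'), 1 - win_p)]
```

Note that after the first game `last` is never `None` again, so the reachable transient states are exactly the old pairs (a, b) tagged with the last winner.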

answered by Jjimenez (6.8k points)