Consider the following blood inventory problem facing a hospital. There is need for a rare blood type, namely, type AB, Rh negative blood. The demand D (in pints) over any 3-day period is given by:

P(D = 0) = 0.4, P(D = 1) = 0.3, P(D = 2) = 0.2, and P(D = 3) = 0.1.

Note that the expected demand is 1 pint, since E(D) = 0.3(1) + 0.2(2) + 0.1(3) = 1. Suppose that there are 3 days between deliveries. The hospital proposes a policy of receiving 1 pint at each delivery and using the oldest blood first. If more blood is required than is on hand, an expensive emergency delivery is made. Blood is discarded if it is still on the shelf after 21 days. Denote the state of the system as the number of pints on hand just after a delivery. Thus, because of the discarding policy, the largest possible state is 7.

a) Follow the 3-step process to model the problem as a Markov Chain. To receive full credit, include the transition probability diagram.

b) Find the steady-state probabilities of the state of the Markov chain

c) Use the steady-state probabilities to determine the probability that a pint of blood will need to be discarded during a 3-day period. (Hint: because the oldest blood is used first, a pint reaches 21 days only if the state was 7 and then D = 0.)

d) Use the steady-state probabilities to determine the probability that an emergency delivery will be needed during the 3-day period between regular deliveries.

1 Answer


Final answer:

The blood inventory problem is modeled as a Markov chain whose state is the number of pints on hand just after a regular delivery. The transition probabilities follow from the demand distribution and the delivery/discard policy, the steady-state probabilities are found by solving the balance equations, and those probabilities in turn give the chance of discarding a pint (maximum inventory followed by zero demand) and of needing an emergency delivery (demand exceeding the pints on hand).

Step-by-step explanation:

Understanding the Blood Inventory Problem as a Markov Chain

The blood inventory problem can be modeled as a Markov chain whose states are the number of pints of blood on hand just after a regular delivery. Because each delivery adds 1 pint and blood still on the shelf after 21 days is discarded, the possible states just after a delivery run from 1 to 7.

Markov Chain Transition Probability Diagram

A transition probability diagram cannot be drawn here, but it would show the states 1 through 7 as nodes connected by directed edges, each edge weighted by the probability of moving from one state to another; those probabilities are determined by the demand distribution D together with the delivery and discard policy. The sketch below tabulates the corresponding transition probabilities.
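One reasonable reading of the policy is that the demand D over a 3-day period is met from inventory, any shortfall is covered by an emergency delivery, the pint that has reached 21 days (only possible from state 7 with D = 0) is discarded, and 1 pint then arrives at the regular delivery; under that reading the next state is min(max(i - D, 0) + 1, 7). The Python sketch below tabulates the resulting transition matrix. The update rule is an interpretation of the stated policy, not something given verbatim in the problem.

```python
import numpy as np

# Demand distribution over a 3-day period (from the problem statement).
p_demand = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}

# States 1..7: pints on hand just after a regular delivery.
# Assumed update rule: next = min(max(i - D, 0) + 1, 7)
#   max(i - D, 0) : inventory left after demand (a shortfall is covered by an emergency delivery)
#   + 1           : the pint received at the next regular delivery
#   min(..., 7)   : the pint that reached 21 days (state 7, D = 0) is discarded
P = np.zeros((7, 7))
for i in range(1, 8):
    for d, prob in p_demand.items():
        j = min(max(i - d, 0) + 1, 7)
        P[i - 1, j - 1] += prob

# Row s-1 holds the transition probabilities out of state s; each row sums to 1.
print(np.round(P, 2))
```

Each nonzero entry P[i - 1, j - 1] corresponds to one labeled edge i -> j in the diagram.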

Steady-State Probabilities

To find the steady-state probabilities of the Markov chain, solve the balance equations pi = pi P (one per state) formed from the transition probabilities, together with the condition that the probabilities sum to 1; a computational sketch follows below.
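A minimal computational sketch (rebuilding the same assumed transition matrix so the snippet runs on its own): stack the balance equations pi P = pi with the normalization sum(pi) = 1 and solve the resulting system.

```python
import numpy as np

# Demand distribution and the same assumed transition rule as in the previous sketch.
p_demand = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}
P = np.zeros((7, 7))
for i in range(1, 8):
    for d, prob in p_demand.items():
        P[i - 1, min(max(i - d, 0) + 1, 7) - 1] += prob

# The steady-state vector pi satisfies pi @ P = pi and sum(pi) = 1.
# Stack the balance equations with the normalization row and solve by least squares.
A = np.vstack([P.T - np.eye(7), np.ones((1, 7))])
b = np.append(np.zeros(7), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print({state: round(p, 4) for state, p in zip(range(1, 8), pi)})
```

The resulting probabilities pi_1, ..., pi_7 are what parts (c) and (d) plug into.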

Probability of Discarding Blood

A pint of blood is discarded only when it reaches 21 days on the shelf, which, because the oldest blood is used first, happens only when the inventory is at its maximum (state 7) and the demand over the following 3-day period is zero. The discard probability is therefore the steady-state probability of state 7 multiplied by the probability that D = 0.
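In symbols, under the transition model sketched above: P(discard during a 3-day period) = pi_7 × P(D = 0) = 0.4 pi_7, where pi_7 is the steady-state probability of state 7.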

Emergency Delivery Probability

Similarly, an emergency delivery is needed when the demand over a 3-day period exceeds the number of pints on hand, so its probability is found by summing, over the states, the steady-state probability of each state times the probability that demand exceeds that inventory level.
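In symbols, under the same model: P(emergency delivery) = sum over i of pi_i × P(D > i). Since demand never exceeds 3 pints, only states 1 and 2 can be caught short, so this reduces to pi_1 × (P(D = 2) + P(D = 3)) + pi_2 × P(D = 3) = 0.3 pi_1 + 0.1 pi_2.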
