Suppose that the measured SampleRTT value is 200ms. Using the Jacobson/Karels Algorithm, compute the TCP TimeoutInterval. You can use α = 0.125 and β = 0.25. You can assume that the value of EstimatedRTT and DevRTT were 206ms and 5ms respectively just before this sample was obtained.

1 Answer

Final answer:

The Jacobson/Karels algorithm computes the TCP TimeoutInterval from the most recent SampleRTT measurement together with the previous EstimatedRTT and DevRTT values. EstimatedRTT and DevRTT are updated with the smoothing factors α and β and then combined, giving a TimeoutInterval of 226.25 ms (approximately 226.3 ms).

Step-by-step explanation:

The question asks for the TCP TimeoutInterval computed with the Jacobson/Karels algorithm, given a measured SampleRTT of 200 ms, and an EstimatedRTT of 206 ms and a DevRTT of 5 ms just before this sample. The values of α (alpha) and β (beta) are 0.125 and 0.25, respectively.

The calculation steps are:

  • Update DevRTT first, using the EstimatedRTT from before this sample (206 ms), as in the original Jacobson/Karels formulation (RFC 6298): DevRTT = (1-β)*DevRTT + β*|SampleRTT - EstimatedRTT|
  • DevRTT = (1-0.25)*5 + 0.25*|200 - 206| = 3.75 + 1.5 = 5.25 ms
  • Update the EstimatedRTT: EstimatedRTT = (1-α)*EstimatedRTT + α*SampleRTT
  • EstimatedRTT = (1-0.125)*206 + 0.125*200 = 180.25 + 25 = 205.25 ms
  • Calculate the TimeoutInterval: TimeoutInterval = EstimatedRTT + 4*DevRTT
  • TimeoutInterval = 205.25 + 4*5.25 = 226.25 ms

The final computed TimeoutInterval using the Jacobson/Karels algorithm is 226.25 ms, or approximately 226.3 ms.
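
If you want to cross-check the arithmetic, here is a minimal Python sketch of the same update. The function and variable names are illustrative, not taken from any real TCP stack; units are milliseconds.

```python
ALPHA = 0.125  # gain for the EstimatedRTT (smoothed RTT) update
BETA = 0.25    # gain for the DevRTT (RTT deviation) update


def update_rtt(sample_rtt, estimated_rtt, dev_rtt):
    """Apply one Jacobson/Karels update and return the new
    (estimated_rtt, dev_rtt, timeout_interval) triple.

    DevRTT is updated with the *old* EstimatedRTT, and EstimatedRTT
    is updated afterwards, matching the ordering in RFC 6298.
    """
    dev_rtt = (1 - BETA) * dev_rtt + BETA * abs(sample_rtt - estimated_rtt)
    estimated_rtt = (1 - ALPHA) * estimated_rtt + ALPHA * sample_rtt
    timeout_interval = estimated_rtt + 4 * dev_rtt
    return estimated_rtt, dev_rtt, timeout_interval


# Values from the question: SampleRTT = 200 ms, previous
# EstimatedRTT = 206 ms, previous DevRTT = 5 ms.
est, dev, timeout = update_rtt(200, 206, 5)
print(est, dev, timeout)  # 205.25 5.25 226.25
```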
