Suppose that you are measuring the time to receive a segment. When an interrupt occurs, you read out the system clock in milliseconds. When the segment is fully processed, you read out the clock again. You measure 0 msec 270,000 times and 1 msec 730,000 times. How long does it take to receive a segment?

asked by User Kassie (6.5k points)

1 Answer

Answer:

730 microseconds (0.73 msec)

Step-by-step explanation:

Measurement of Average Value

The average is the sum of all measured values divided by the total number of measurements.

Here, each measured time is given together with its number of occurrences, so we multiply each time by its count, add the products, and divide by the total number of measurements.

Calculation of Average Value

Total measurements = 270,000 + 730,000 = 1,000,000

Average = (270,000 × 0 msec + 730,000 × 1 msec) / 1,000,000

= 0.73 msec

= 730 microseconds

So the average time to receive a segment is 730 microseconds.
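As a quick check, the weighted-average calculation above can be reproduced in a short Python sketch (the variable names are just illustrative):

```python
# Counts of each observed clock reading, from the problem statement
count_0ms = 270_000  # measurements that read 0 msec
count_1ms = 730_000  # measurements that read 1 msec

total = count_0ms + count_1ms  # 1,000,000 measurements in all

# Weighted average: each reading times its count, divided by the total
avg_msec = (count_0ms * 0 + count_1ms * 1) / total  # 0.73 msec
avg_usec = avg_msec * 1000                          # convert to microseconds

print(avg_usec)  # prints 730.0
```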

answered by User David Kennell (6.9k points)