Consider an institutional network accessing the internet via a 1 Mbps link. Suppose requests arrive at an average rate of 25 per second, with the average size of a returned object being 40,000 bits. Suppose a local proxy server (cache) is installed with a 40% hit rate.

i) Describe the processing of HTTP queries from institutional users, including the use of conditional GET.
ii) What is the link utilisation prior to the cache being installed?
iii) What is the link utilisation after the cache is installed?

iv) Comment on the contribution to delay of the access link with and without the use of the cache. Assume that the round trip time delay dominates the transmission and local access delays.

1 Answer

Final answer:

HTTP requests go first to the local proxy, which checks its cache (revalidating with a conditional GET when needed). Without the cache the access-link utilization is 100%; with a 40% hit rate it drops to 60%. The cache therefore greatly reduces the delay contributed by the access link.

Step-by-step explanation:

The question concerns HTTP requests in an institutional network with a local proxy server. For part (i), a user's browser sends each HTTP request to the proxy, which checks its local cache first. If the object is cached, the proxy can issue a conditional GET (a GET with an If-Modified-Since header) to the origin server: a 304 Not Modified response means the cached copy is still fresh and is returned directly, while a 200 response carries a newer version that replaces the cached one. If the object is not cached, the request is forwarded over the access link to the origin server and the response is cached on the way back.
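
As a rough illustration (not part of the original answer), the revalidation step can be sketched in Python with urllib; the URL and the stored Last-Modified timestamp below are placeholders:

# Sketch of a proxy revalidating a cached object with a conditional GET.
# The URL and timestamp are illustrative placeholders, not real values.
import urllib.request
from urllib.error import HTTPError

url = "http://example.com/object"                        # hypothetical cached object
cached_last_modified = "Tue, 01 Oct 2024 00:00:00 GMT"   # stored with the cache entry

req = urllib.request.Request(url, headers={"If-Modified-Since": cached_last_modified})
try:
    with urllib.request.urlopen(req) as resp:
        body = resp.read()       # 200 OK: object changed, refresh the cache and return it
        print("object modified; re-cache", len(body), "bytes")
except HTTPError as err:
    if err.code == 304:          # 304 Not Modified: cached copy is still fresh
        print("not modified; serve the cached copy locally")
    else:
        raise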

For part (ii), prior to installing the cache, the link utilization follows from the request rate and the average object size. With 25 requests per second and each object being 40,000 bits, the traffic is 25 × 40,000 bits/s = 1,000,000 bits/s, which equals the 1 Mbps link capacity, so the link is fully (100%) utilized.
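
A quick way to check the arithmetic (a sketch, not part of the original answer):

# Part (ii): access-link utilization before the cache is installed.
request_rate = 25           # requests per second
object_size = 40_000        # bits per object
link_capacity = 1_000_000   # bits per second (1 Mbps)

traffic = request_rate * object_size       # 25 * 40,000 = 1,000,000 bits/s
print(f"utilization without cache: {traffic / link_capacity:.0%}")   # 100%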

For part (iii), after the cache is installed only the 60% of requests that miss in the cache cross the access link. The traffic is 0.6 × 25 × 40,000 = 600,000 bits/s, so the utilization of the 1 Mbps link drops to 60%.
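
The same check with the 40% hit rate applied (again just a sketch):

# Part (iii): access-link utilization with a 40% cache hit rate.
hit_rate = 0.40
miss_traffic = (1 - hit_rate) * 25 * 40_000   # only cache misses cross the link
print(f"utilization with cache: {miss_traffic / 1_000_000:.0%}")      # 60%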

Lastly, for part (iv), without the cache the access link runs at essentially 100% utilization, so its queueing delay becomes very large and adds substantially to the total response time. With the cache, 40% of requests are answered locally with negligible delay and the link utilization falls to 60%, so the access-link delay becomes small; for the remaining requests the internet round trip time, which dominates the other delays, sets the response time.
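
To make the comparison concrete, here is an illustrative calculation; the 2-second internet round trip time and the near-zero hit delay are assumed values, not figures given in the question:

# Part (iv): rough average-delay comparison. The RTT and hit delay are assumptions.
internet_rtt = 2.0    # seconds per cache miss (assumed; RTT dominates other delays)
local_delay = 0.01    # seconds per cache hit (assumed negligible)
hit_rate = 0.40

# Without the cache every request crosses the access link, which runs at ~100%
# utilization, so queueing delay there grows very large on top of the RTT.
delay_without_cache = internet_rtt

# With the cache, hits are served locally and the link runs at 60% utilization,
# so the access-link delay is modest and the RTT is paid only on misses.
delay_with_cache = hit_rate * local_delay + (1 - hit_rate) * internet_rtt

print(f"average delay without cache: ~{delay_without_cache:.2f} s plus severe queueing")
print(f"average delay with cache:    ~{delay_with_cache:.2f} s")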
