Final answer:
For CSMA/CD networks, the frame size must be increased proportionally with the data rate to ensure collision detection. At 100 Mbps, the minimum frame size is 5120 bits. At 10 Gbps, it is 512000 bits.
Step-by-step explanation:
Options (i) and (ii) above give the minimum frame sizes needed for correct operation of the CSMA/CD process when the data rate is increased.
In CSMA/CD (Carrier Sense Multiple Access with Collision Detection) network protocols, the minimum frame size is directly related to the end-to-end signal propagation time across the network. It guarantees that the sender can detect a collision before the end of the transmission. This means that the minimum frame size must be large enough to allow the transmitting station to occupy the medium for a time period at least equal to twice the propagation delay time.
If the data rate increases but the size of the network remains constant, the propagation delay (assuming it's primarily a function of distance and medium) does not change. However, the frame will be transmitted faster at a higher data rate. To maintain the necessary transmission time for collision detection, the frame size must increase proportionally with the data rate.
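The constraint described above can be sketched numerically. The transmission time L/R must be at least the round-trip delay 2τ, so L_min = 2·τ·R. The propagation delay τ = 25.6 µs below is an assumed value, chosen only so that the classic 10 Mbps Ethernet figure of 512 bits falls out:

```python
def min_frame_bits(data_rate_bps: float, prop_delay_s: float) -> float:
    """Smallest frame (in bits) whose transmission time L/R
    is at least the round-trip propagation delay 2*tau."""
    return 2 * prop_delay_s * data_rate_bps

tau = 25.6e-6  # assumed one-way propagation delay in seconds

print(min_frame_bits(10e6, tau))  # 512.0 bits at 10 Mbps
```

Because τ is fixed by the network's physical size, the only way to keep L/R ≥ 2τ at a higher rate R is to grow L in proportion.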
So, if the original data rate of 10 Mbps required a minimum frame size of 512 bits to ensure collision detection:
(i) At 100 Mbps, which is 10 times the original speed, the minimum frame size would need to be 10 times larger, or 5120 bits.
(ii) At 10 Gbps, which is 1000 times the original speed, the minimum frame size would need to be 1000 times larger, or 512000 bits.
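The two cases above are a straight proportional scaling from the 10 Mbps / 512-bit baseline, which a short sketch can confirm:

```python
BASE_RATE = 10e6   # baseline data rate: 10 Mbps
BASE_FRAME = 512   # baseline minimum frame size in bits

def scaled_min_frame(new_rate_bps: float) -> int:
    # With the network size (and hence propagation delay) held
    # constant, the minimum frame grows linearly with the data rate.
    return int(BASE_FRAME * new_rate_bps / BASE_RATE)

print(scaled_min_frame(100e6))  # 5120 bits   -> case (i)
print(scaled_min_frame(10e9))   # 512000 bits -> case (ii)
```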