Suppose that for TCP’s timeout interval estimation, EstimatedRTT is 4.0 at some point and all subsequent measured RTTs are 1.0. If the initial value of Deviation was 0.75, how long does it take before the TimeOut value falls below 4.0? Use α = 0.875, β = 0.125.

Answer:

We're examining TCP's adaptive timeout estimation (the Jacobson/Karels algorithm). Initially EstimatedRTT (ERTT) = 4.0 and Deviation (Dev) = 0.75, so TimeOut = ERTT + 4·Dev = 7.0. Every subsequent measured RTT is 1.0, and the gain factors are α = 0.875 and β = 0.125. Applying the update rules sample by sample, the TimeOut value first falls below 4.0 after 20 RTT measurements.

Explanation:

After each new SampleRTT, TCP updates its estimates as

EstimatedRTT = α · EstimatedRTT + (1 − α) · SampleRTT
Deviation = (1 − β) · Deviation + β · |SampleRTT − EstimatedRTT|
TimeOut = EstimatedRTT + 4 · Deviation

where the deviation term uses the error against the old EstimatedRTT. With α = 0.875, β = 0.125, and SampleRTT = 1.0 every time, the first iterations are:

n = 0: ERTT = 4.000, Dev = 0.750, TimeOut = 7.000
n = 1: ERTT = 0.875·4.0 + 0.125·1.0 = 3.625, Dev = 0.875·0.75 + 0.125·|1.0 − 4.0| = 1.031, TimeOut = 7.750
n = 2: ERTT = 3.297, Dev = 1.230, TimeOut = 8.219
n = 3: ERTT = 3.010, Dev = 1.364, TimeOut = 8.465
n = 4: ERTT = 2.759, Dev = 1.445, TimeOut = 8.537 (the peak)

TimeOut actually rises at first, because the large early error |1.0 − 4.0| = 3.0 inflates Deviation faster than EstimatedRTT shrinks. After the peak it decays geometrically; in closed form,

TimeOut(n) = 1 + 0.875^(n−1) · (5.25 + 1.5·n),

which stays above 4.0 through n = 19 (TimeOut ≈ 4.051) and drops below it at n = 20 (TimeOut ≈ 3.788).

Answer: it takes 20 RTT samples before the TimeOut value falls below 4.0.
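The iteration is easy to check with a short simulation. This is a sketch assuming the standard Jacobson/Karels update rules with TimeOut = EstimatedRTT + 4·Deviation (the usual choice of multipliers); the function name and parameters are just illustrative:

```python
def samples_until_timeout_below(ertt=4.0, dev=0.75, sample=1.0,
                                delta=0.125, threshold=4.0):
    """Count RTT samples until TimeOut = ERTT + 4*Dev drops below threshold.

    delta = 0.125 is the gain, i.e. ERTT keeps a 0.875 (= alpha) share of its
    old value and Dev keeps a 0.875 (= 1 - beta) share of its old value.
    """
    n = 0
    timeout = ertt + 4 * dev          # initial TimeOut = 4.0 + 4*0.75 = 7.0
    while timeout >= threshold:
        diff = sample - ertt          # error against the OLD estimate
        ertt += delta * diff          # ERTT = 0.875*ERTT + 0.125*Sample
        dev += delta * (abs(diff) - dev)  # Dev = 0.875*Dev + 0.125*|diff|
        timeout = ertt + 4 * dev
        n += 1
    return n, timeout

n, timeout = samples_until_timeout_below()
print(n, round(timeout, 3))   # → 20 samples, TimeOut ≈ 3.788
```

Running it confirms the hand calculation: TimeOut first rises to about 8.54, then decays, crossing below 4.0 on the 20th sample.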