12 May, 18:50

Suppose two hosts, A and B, are separated by 7,500 kilometers and are connected by a direct link of R = 10 Mbps. Suppose the propagation speed over the link is 2.5 x 10^8 meters/sec. Consider sending a large packet of 500,000 bits from Host A to Host B. How many milliseconds (ms) does it take before the receiver has received the entire 500,000-bit file?

Answers (1)
  1. 12 May, 22:05
    It takes 80 ms in total for the receiver to have the entire file.

    Explanation:

    Two delays contribute here. The transmission delay is the time Host A needs to push all the bits of the packet onto the link; it equals the packet size divided by the link rate. The propagation delay is the time one bit takes to travel the length of the link; it equals the link length divided by the propagation speed. The receiver has the whole packet once the last bit has been pushed onto the link and has propagated across it, so the total time is the sum of the two.

    Transmission delay:

    d_trans = L / R = 500,000 bits / 10,000,000 bps = 0.05 sec = 50 ms

    Propagation delay:

    d_prop = d / s = 7,500,000 m / (2.5 x 10^8 m/s) = 0.03 sec = 30 ms

    Total time until the receiver has the entire packet:

    d_trans + d_prop = 50 ms + 30 ms = 80 ms

    Note that 500,000 / 10,000,000 = 0.05 sec is only the transmission delay; stopping there (50 ms) ignores the 30 ms the bits spend crossing the 7,500 km link.
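    As a sanity check, the arithmetic above can be reproduced in a few lines of Python (the variable names are my own, chosen to match the formulas):

    ```python
    # Delay calculation for the problem above.
    L = 500_000          # packet size in bits
    R = 10_000_000       # link rate in bits/sec (10 Mbps)
    d = 7_500 * 1_000    # link length in meters (7,500 km)
    s = 2.5e8            # propagation speed in meters/sec

    d_trans = L / R      # time to push all bits onto the link (sec)
    d_prop = d / s       # time for one bit to cross the link (sec)
    total_ms = (d_trans + d_prop) * 1000

    print(f"transmission = {d_trans * 1000:.0f} ms")  # 50 ms
    print(f"propagation  = {d_prop * 1000:.0f} ms")   # 30 ms
    print(f"total        = {total_ms:.0f} ms")        # 80 ms
    ```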