23 March, 22:42

Radio signals travel at a rate of 3 * 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of the Earth if the satellite is orbiting at a height of 3.6 * 10^7 meters?

A.) 8.3 seconds

B.) 1.2 * 10^-1 seconds

C.) 1.08 * 10^16 seconds

D.) 10.8 * 10^15 seconds

Answers (1)
  1. 23 March, 23:29
    In this problem we are given the speed of the signal and the distance traveled, and we are asked to determine the travel time. In this case, the expression that applies is

    time = distance / speed

    = (3.6 * 10⁷ m) / (3 * 10⁸ m/s)

    = (3.6 / 3) * 10⁷⁻⁸ s

    = 1.2 * 10⁻¹ s

    t = 0.12 s
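    The division above can be sanity-checked with a short script. This is just a minimal sketch of the arithmetic; the variable names are illustrative, not from the original problem.

    ```python
    # Time for a radio signal to cross the satellite's orbital altitude:
    # time = distance / speed
    speed = 3e8       # signal speed in meters per second
    distance = 3.6e7  # satellite altitude in meters

    t = distance / speed
    print(t)  # 0.12 seconds, i.e. 1.2 * 10^-1 s, matching choice B
    ```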