26 June, 04:12

Radio signals travel at a rate of 3 x 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 9.6 x 10^6 meters? (Hint: time is distance divided by speed.)

Answers (1)
  1. 26 June, 05:31
    3.2x10^-2 seconds (0.032 seconds)

    This is a simple matter of division. I also suspect it's an exercise in scientific notation, so here is how to divide in scientific notation:

    9.6 x 10^6 m / 3x10^8 m/s

    First, divide the significands as you would normally:

    9.6 / 3 = 3.2

    Then subtract the exponents:

    6 - 8 = -2

    So the answer is 3.2 x 10^-2

    Since the significand is at least 1 and less than 10, we don't need to normalize it.

    So it takes 3.2 x 10^-2 seconds (0.032 seconds) for the radio signal to travel from the satellite to the surface of Earth.
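
    The arithmetic above can be checked with a short Python sketch (variable names are my own, not from the problem):

    ```python
    # Time = distance / speed
    distance = 9.6e6   # meters: satellite altitude, 9.6 x 10^6 m
    speed = 3e8        # meters per second: radio signal speed, 3 x 10^8 m/s

    time = distance / speed
    print(time)  # 0.032 seconds, i.e. 3.2 x 10^-2 s
    ```

    Python's `e` notation (`9.6e6`) is just scientific notation, so the division mirrors the hand calculation: significands divide (9.6 / 3 = 3.2) and exponents subtract (6 - 8 = -2).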