28 July, 19:13

Radio signals travel at a rate of 3 x 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 9.6 x 10^6 meters? (Hint: time = distance divided by speed.)

A) 3.2 x 10^2 seconds

B) 3.2 x 10^-2 seconds

C) 3.13 x 10^1 seconds

D) 2.88 x 10^15 seconds

Answers (1)
  1. 28 July, 21:32
    Distance = velocity x time

    distance = 9.6 x 10^6 meters

    velocity = 3x10^8 meters/second

    time = distance / velocity = (9.6 x 10^6 m) / (3 x 10^8 m/s) = 3.2 x 10^-2 seconds

    The units check out as m / (m/s) = s, so the answer is B.
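    The arithmetic above can be checked with a short script (a minimal sketch; variable names are illustrative, not from the problem):

    ```python
    # Travel time of a radio signal from satellite to Earth: time = distance / speed.
    speed = 3e8        # signal speed in meters per second (speed of light)
    distance = 9.6e6   # orbital height in meters

    time = distance / speed
    print(f"{time:.1e} seconds")  # prints "3.2e-02 seconds"
    ```

    This confirms 3.2 x 10^-2 seconds, matching choice B.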