3 June, 17:02

A radio signal travels at 3.00 • 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54 • 10^7 meters? Show your work.

Would the answer be:

rate = 3 * 10^8 m/s

distance = 3.6 * 10^7 m

time = distance / rate = 3.6 / 30?

Answers (1)
  1. 3 June, 18:01
    The answer is 0.118 seconds.

    The velocity (v) is the distance (d) divided by time (t):

    v = d / t

    Given:

    v = 3.00 * 10⁸ meters per second

    d = 3.54 * 10⁷ meters

    Unknown:

    t = ?

    If:

    v = d / t

    Then:

    t = d / v

    t = (3.54 * 10⁷ meters) / (3.00 * 10⁸ meters/second)

    t = 1.18 * 10⁻¹ seconds

    t = 0.118 seconds

    Therefore, the radio signal will travel from the satellite to the surface of Earth in 0.118 seconds.
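
    As a quick check, here is the same calculation as a minimal Python sketch (the variable names are only illustrative):

    # Time for the radio signal to travel from the satellite to Earth's surface
    speed = 3.00e8       # signal speed v, in meters per second
    distance = 3.54e7    # satellite height d, in meters

    time = distance / speed   # t = d / v
    print(round(time, 3))     # 0.118 (seconds)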