18 September, 11:43

A rock is thrown horizontally at a speed of 5.0 m/s from the top of a cliff 64.7 m high. The rock hits the ground 18.0 m from the base of the cliff. How would this distance change if the rock was thrown at 10.0 m/s?

Answers (1)
  1. 18 September, 13:01

    Assume g = 9.8 m/s² and ignore air resistance.

    When the rock is launched from the top of the cliff, 64.7 m above the ground,

    u = 5.0 m/s is the horizontal velocity, and

    v₀ = 0 is the initial vertical velocity.

    The horizontal velocity stays constant, so if the rock hits the ground 18.0 m from the base of the cliff, the time of flight is

    t = (18.0 m) / (5.0 m/s) = 3.6 s
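    A quick way to check this arithmetic is a minimal Python sketch (the variable names are my own, not part of the problem):

    ```python
    u = 5.0   # horizontal launch speed, m/s
    x = 18.0  # horizontal distance from the base of the cliff, m

    # Horizontal velocity is constant, so time of flight = distance / speed
    t = x / u
    print(t)  # 3.6 (seconds)
    ```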

    The vertical distance traveled is

    s = (1/2) * (9.8 m/s²) * (3.6 s)² = 63.504 m

    Because this distance is slightly less than 64.7 m, the ground at the landing point must be slightly higher than at the base of the cliff. It is higher by

    64.7 m - 63.504 m = 1.196 m
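    The vertical drop and the small mismatch can be checked the same way (a sketch using the same numbers as above):

    ```python
    g = 9.8   # acceleration due to gravity, m/s^2
    t = 3.6   # time of flight found from the horizontal data, s
    h = 64.7  # stated cliff height, m

    # Drop from rest in the vertical direction during the flight: s = (1/2) g t^2
    s = 0.5 * g * t**2
    print(round(s, 3))      # 63.504 (metres)
    print(round(h - s, 3))  # 1.196 (metres) -- the difference noted above
    ```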

    If the rock is thrown at 10.0 m/s, the time of flight remains the same, because the fall time depends only on the drop height and the acceleration due to gravity, not on the horizontal speed.

    Therefore the horizontal distance traveled is

    (10.0 m/s) * (3.6 s) = 36.0 m

    Answer: The distance will be 36.0 m
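    For completeness, the last step as a sketch (same assumption: the fall time does not change, so the range simply scales with the horizontal speed):

    ```python
    u_new = 10.0  # new horizontal launch speed, m/s
    t = 3.6       # time of flight, unchanged (same height, same g)

    # Horizontal range = horizontal speed * time of flight
    x_new = u_new * t
    print(x_new)  # 36.0 (metres)
    ```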