13 January, 13:55

A particular baseball pitcher throws a baseball at a speed of 39.1 m/s (about 87.5 mi/hr) toward home plate. We use g = 9.8 m/s² and ignore air friction.

(a) Assuming the pitcher releases the ball 16.6 m from home plate and throws it so the ball is initially moving horizontally, how long does it take the ball to reach home plate?

Answers (1)
  1. 13 January, 16:47
    There is no acceleration in the horizontal direction (gravity acts only vertically), so we can use v = d/t, where v is velocity, d is distance, and t is time. Solving for time gives t = d/v. Since the 39.1 m/s is entirely horizontal, there is no need to break it into components with sines and cosines; plugging in the numbers gives t = (16.6 m) / (39.1 m/s) ≈ 0.42 s. During that time the ball falls about Δy = ½gt² = ½ (9.8 m/s²)(0.42 s)² ≈ 0.88 m below its release height, so it arrives over home plate a bit lower than where it left the pitcher's hand (the problem doesn't specify the release height). Cheers!
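    A minimal sketch of the same arithmetic in Python (assumes no air resistance and a purely horizontal release, per the problem; the variable names are my own):

        # Horizontal travel time and vertical drop for the pitch described above.
        v = 39.1   # release speed, m/s (all horizontal)
        d = 16.6   # distance to home plate, m
        g = 9.8    # gravitational acceleration, m/s^2

        t = d / v                # no horizontal acceleration, so t = d / v
        drop = 0.5 * g * t**2    # vertical fall during that time (starts with zero vertical velocity)

        print(f"time to reach home plate: {t:.2f} s")   # ~0.42 s
        print(f"vertical drop in that time: {drop:.2f} m")  # ~0.88 m below release height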