
A particular baseball pitcher throws a baseball at a speed of 39.1 m/s (about 87.5 mi/hr) toward home plate. We use g = 9.8 m/s² and ignore air friction.

(a) Assuming the pitcher releases the ball 16.6 m from home plate and throws it so the ball is initially moving horizontally, how long does it take the ball to reach home plate?

Answer:

There is no acceleration in the horizontal direction (gravity acts only vertically), so the horizontal motion obeys v = d/t, where v is the horizontal speed, d the distance, and t the time. Solving for time gives t = d/v. Because the ball is released moving horizontally, the full 39.1 m/s is already horizontal, so there is no need to break it into components: t = (16.6 m)/(39.1 m/s) ≈ 0.42 s. In that time the ball only drops about ½(9.8 m/s²)(0.42 s)² ≈ 0.9 m, so it arrives in the air above home plate rather than falling all the way to the ground (the exact height depends on the release height, which isn't given). Cheers!
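
For anyone who wants to double-check the arithmetic, here is a minimal Python sketch of the same calculation; the variable names (v, d, g) are just labels for the numbers given in the problem, not anything from the original post:

    # Horizontal flight time and vertical drop of a horizontally thrown pitch
    v = 39.1   # horizontal speed of the pitch, m/s (given)
    d = 16.6   # distance from release point to home plate, m (given)
    g = 9.8    # gravitational acceleration, m/s^2 (given)

    t = d / v                 # horizontal speed is constant, so t = d / v
    drop = 0.5 * g * t ** 2   # vertical drop from rest in the vertical direction

    print(f"time to reach the plate: {t:.2f} s")      # ~0.42 s
    print(f"vertical drop in that time: {drop:.2f} m")  # ~0.88 m

Running it confirms the ~0.42 s flight time and shows the ball falls a bit under a meter on the way, consistent with the note above.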