A ball is thrown horizontally with an initial speed of 10 m/s from an 80 m cliff. How long does it take to reach the ground?
Given:
"h=80\\:\\rm m"
"g=9.8\\:\\rm m\/s^2"
"v_0=10\\:\\rm m\/s"
Because the launch velocity is purely horizontal, the vertical motion is free fall, so the time of flight equals the free-fall time from the height $h$:
"t=\\sqrt{\\frac{2h}{g}}=\\sqrt{\\frac{2*80}{9.8}}=4.0\\:\\rm s"