Q:

A ball is thrown upward with an initial velocity of 35 meters per second from a cliff that is 30 meters high. The height of the ball is given by the quadratic equation h = -4.9t^2 + 35t + 30, where h is in meters and t is the time in seconds since the ball was thrown. Find the time it takes the ball to hit the ground. Round your answer to the nearest tenth of a second.

I understand that the equation is given to you, but I'm not certain how to figure it out. If someone could instruct me on HOW to get the correct answer instead of just giving me the answer, I would appreciate it.

Accepted Solution

A:
Answer: approximately 7.9 seconds.

Step-by-step explanation:

The motion of the ball is fully described by the quadratic equation you were given. Notice that the constant term c = 30 is the cliff's height and the coefficient b = 35 is the initial velocity, so instead of setting up separate auxiliary linear equations relating velocity and height, we can work directly with the quadratic:

[tex]h=-4.9t^{2}+35t+30[/tex]

The ball reaches the ground when its height is zero, so set h = 0 and solve for t using the quadratic formula with a = -4.9, b = 35, c = 30:

[tex]t=\frac{-35\pm\sqrt{35^{2}-4(-4.9)(30)}}{2(-4.9)}=\frac{-35\pm\sqrt{1813}}{-9.8}[/tex]

Since [tex]1813=49\cdot 37[/tex], the two roots simplify to

[tex]t'=\frac{25-5\sqrt{37}}{7}\approx -0.77,\qquad t''=\frac{25+5\sqrt{37}}{7}\approx 7.92[/tex]

Discard the negative root, since time cannot be negative. Rounded to the nearest tenth, the ball hits the ground at t ≈ 7.9 seconds.
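If you'd like to double-check the arithmetic numerically, here is a small Python sketch (my own addition, not part of the original answer) that plugs a = -4.9, b = 35, c = 30 into the quadratic formula:

import math

# Coefficients of h(t) = -4.9 t^2 + 35 t + 30, from the problem statement
a, b, c = -4.9, 35.0, 30.0

# Discriminant: b^2 - 4ac = 1225 + 588 = 1813
disc = b**2 - 4 * a * c

# Quadratic formula: t = (-b +/- sqrt(disc)) / (2a)
t1 = (-b + math.sqrt(disc)) / (2 * a)   # about -0.77, rejected (negative time)
t2 = (-b - math.sqrt(disc)) / (2 * a)   # about 7.92

print(round(t2, 1))   # prints 7.9

Running it prints 7.9, matching the rounded answer above.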