Part A: An archer shoots an arrow into the sky, where the motion of the arrow can be modeled by the equation f(t) = -16t^2 + 80t, where t is time in seconds and f(t) is height in feet. Find how long it will take the arrow to hit the ground using an algebraic method. Show all of your reasoning/steps used with this algebraic method.

Part B: Using the information in Part A, find how high the arrow will go at its maximum height, and how long it takes for this to occur using an algebraic method. Show all of your reasoning used with this algebraic method.

Please help

asked by Sastrija

1 Answer


Answer:

See work/explanation below.

Explanation:

Part A: The arrow hits the ground when its height is zero, so set f(t) = 0 and solve -16t^2 + 80t = 0 algebraically. Factor out the common term -16t:

-16t(t - 5) = 0, so t = 0 or t = 5.

t = 0 is the moment the arrow is shot, so the arrow hits the ground at t = 5 seconds. (The quadratic formula with a = -16, b = 80, c = 0 gives the same two roots.)
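As a quick numeric sanity check (not something the question asks for), here is a short Python sketch that applies the quadratic formula to -16t^2 + 80t = 0; the variable names a, b, c are just illustrative:

```python
import math

# Coefficients of f(t) = -16t^2 + 80t, read as at^2 + bt + c
a, b, c = -16.0, 80.0, 0.0

disc = b**2 - 4*a*c                  # discriminant: 6400
t1 = (-b + math.sqrt(disc)) / (2*a)  # root 1
t2 = (-b - math.sqrt(disc)) / (2*a)  # root 2

print(sorted([t1, t2]))              # [0.0, 5.0] -> lands at t = 5 s
```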

Part B: Since the leading coefficient is negative, the parabola opens downward and its maximum is at the vertex. For f(t) = at^2 + bt + c, the vertex occurs at t = -b/(2a). Here a = -16 and b = 80, so t = -80/(2(-16)) = 2.5 seconds. Substituting back into the equation: f(2.5) = -16(2.5)^2 + 80(2.5) = -100 + 200 = 100. The arrow reaches its maximum height of 100 feet at t = 2.5 seconds.
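Likewise, a minimal Python sketch of the Part B vertex computation, assuming the same model f(t) = -16t^2 + 80t:

```python
a, b = -16.0, 80.0

def f(t):
    """Height of the arrow in feet after t seconds."""
    return a * t**2 + b * t

t_peak = -b / (2 * a)        # vertex formula: t = -b / (2a) = 2.5
print(t_peak, f(t_peak))     # 2.5 100.0 -> 100 ft at t = 2.5 s
```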

answered by GregC