Using the Pythagorean theorem with the distance from the shortstop to second base (25 feet) and the distance from second base to first base (90 feet), the shortstop needs to throw the ball approximately 93.4 feet.
Find the distance the shortstop needs to throw the ball using the Pythagorean theorem:
Identify the right triangle:
Picture the baseball diamond (the infield) as a square with sides of 90 feet.
The shortstop, second base, and first base form a right triangle, with the right angle at second base: the shortstop is positioned along the base line running from second base toward third base, and that line is perpendicular to the line from second base to first base, since they are adjacent sides of the square.
The distance the shortstop needs to throw is the hypotenuse of this triangle.
Identify the legs of the triangle:
One leg is the distance between the shortstop and second base, which is given as 25 feet.
The other leg is the distance between second base and first base, which is one full side of the square, so it is 90 feet.
Apply the Pythagorean theorem:
The Pythagorean theorem states that in a right triangle, the square of the hypotenuse (c) is equal to the sum of the squares of the other two sides (a and b).
In this case, we have: c^2 = a^2 + b^2
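For readers who want to see the relationship in code, here is a minimal Python sketch of the theorem; the function name hypotenuse is just an illustrative choice, not something from the original problem.

```python
import math

def hypotenuse(a: float, b: float) -> float:
    """Return the hypotenuse length of a right triangle with legs a and b."""
    # Pythagorean theorem: c^2 = a^2 + b^2, so c = sqrt(a^2 + b^2).
    return math.sqrt(a**2 + b**2)
```

For example, hypotenuse(3, 4) returns 5.0.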
Plug in the values and solve:
c^2 = 25^2 + 90^2 = 625 + 8100 = 8725
To find the value of c (the hypotenuse), take the square root of both sides: c = √8725
Calculate and round the answer:
√8725 ≈ 93.41
Rounded to the nearest tenth: 93.4 feet
Therefore, the shortstop needs to throw the ball approximately 93.4 feet to reach first base.
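As a quick numerical sanity check on the arithmetic above (assuming legs of 25 feet and 90 feet), here is a short, self-contained Python snippet:

```python
import math

# Legs of the right triangle: shortstop to second base, and second base to first base.
a, b = 25.0, 90.0

# math.hypot(a, b) computes sqrt(a**2 + b**2), the length of the hypotenuse.
c = math.hypot(a, b)

print(round(c, 1))  # prints 93.4
```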