Driving down a mountain, Bob Dean finds that he descends 5500 feet in elevation by the time he is 3.6 miles (horizontally) away from the high point on the mountain road. Find the slope of his descent. (1 mile = 5280 feet.)

The slope is ________. (Type an integer or decimal rounded to two decimal places as needed.)


1 Answer


Final answer:

The slope of Bob Dean's descent down a mountain is calculated by taking the elevation change (5500 feet) and dividing it by the horizontal distance covered (3.6 miles, converted to 19008 feet), resulting in a slope of approximately 0.29.

Step-by-step explanation:

To find the slope of his descent, we need to calculate the ratio of the vertical change (rise) to the horizontal change (run). The descent in elevation is 5500 feet, and the horizontal distance covered is 3.6 miles. First, we need to convert miles to feet by multiplying 3.6 by 5280, which gives us 19008 feet.

The slope is then calculated by dividing the rise by the run:

Slope = Rise / Run = 5500 feet / 19008 feet

When we do the calculation, we get:

Slope ≈ 0.2894, which rounds to 0.29 (two decimal places).

Therefore, the slope of Bob Dean's descent is approximately 0.29.
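The arithmetic above can be sketched in a few lines of Python (variable names here are illustrative, not from the problem):

```python
FEET_PER_MILE = 5280

rise_ft = 5500               # elevation drop in feet
run_miles = 3.6              # horizontal distance in miles

run_ft = run_miles * FEET_PER_MILE   # 3.6 * 5280 = 19008 feet
slope = rise_ft / run_ft             # rise over run

print(round(slope, 2))       # prints 0.29
```

Note that both rise and run must be in the same unit (feet) before dividing; forgetting the mile-to-feet conversion is the most common mistake on this kind of problem.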
