Final answer:
By setting up an equation based on the runners' speeds and the three-quarter-hour difference in their finishing times, we find that both runners covered 18 miles.
Step-by-step explanation:
The problem asks for the distance covered by two runners who start at the same place and run the same trail at different average speeds. The key is to relate their finishing times through the three-quarter-hour gap between them.
Let's denote the distance both runners cover as d. The faster runner, averaging 8 miles per hour, will take d/8 hours to complete the distance, while the slower runner, averaging 6 miles per hour, will take d/6 hours. The slower runner takes three-quarters of an hour longer to finish, so we can write the equation:
d/6 - d/8 = 3/4
Multiplying both sides by 24, the least common denominator of 6 and 8, and solving for d:
4d - 3d = 24 * 3/4
d = 18
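As a quick sanity check, here is a minimal Python sketch (exact fractions, with illustrative variable names) that recovers the same distance from the speeds and the time gap:

    from fractions import Fraction

    fast = Fraction(8)      # faster runner's speed, miles per hour
    slow = Fraction(6)      # slower runner's speed, miles per hour
    gap = Fraction(3, 4)    # slower runner finishes 3/4 hour later

    # d/slow - d/fast = gap  =>  d = gap / (1/slow - 1/fast)
    d = gap / (1 / slow - 1 / fast)
    print(d)                        # 18
    print(d / slow - d / fast)      # 3/4, matching the time gap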
Both runners covered the same 18-mile distance, which is the value to match against the given answer choices. As a check: the faster runner needs 18/8 = 2.25 hours, the slower runner needs 18/6 = 3 hours, and the difference is exactly three-quarters of an hour.