Answer:
Christie is wrong.
She should divide the distance run by the time taken, not subtract one from the other.
Explanation:
The jogging rate in miles per hour is given by distance run ÷ time taken.
For the first 10 miles, it took her 2 hours, so her rate was 10/2 = 5 mph.
For the first 15 miles, it took her 3 hours, so her rate was 15/3 = 5 mph.
For the first 20 miles, it took her 4 hours, so her rate was 20/4 = 5 mph.
For every hour, Christie ran 5 miles, so her average rate was constant at 5 mph.
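As a quick check of the arithmetic, here is a minimal Python sketch (the checkpoint values are taken from the steps above; the variable names are just for illustration):

    # Checkpoints from the explanation: (miles run, hours taken)
    checkpoints = [(10, 2), (15, 3), (20, 4)]

    for miles, hours in checkpoints:
        rate = miles / hours  # rate = distance ÷ time, not distance − time
        print(f"{miles} miles in {hours} hours -> {rate} mph")

    # Every checkpoint prints 5.0 mph, confirming the rate is constant.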