A marathon is roughly 26.2 miles long. Which equation could be used to determine the time, t, it takes to run a marathon as a function of the average speed, s, of the runner, where t is in hours and s is in miles per hour?

1 Answer

Time is distance divided by speed. The distance is fixed at 26.2 miles, so at an average speed of s miles per hour the time in hours is t = 26.2/s. (The number of runners doesn't matter; each runner's time depends only on their own average speed.)
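As a quick sanity check, here is a minimal Python sketch of the formula; the function name marathon_time is just illustrative, not something from the question:

MARATHON_MILES = 26.2

def marathon_time(speed_mph: float) -> float:
    """Return the time in hours to run a marathon at the given average speed (mph)."""
    if speed_mph <= 0:
        raise ValueError("speed must be positive")
    # t = 26.2 / s
    return MARATHON_MILES / speed_mph

# Example: at 6 mph, t = 26.2 / 6, which is about 4.37 hours.
print(marathon_time(6))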
answered by Jarin Udom (8.0k points)
