Answer:
5 minutes
Explanation:
To find the number of minutes it takes Jason to run a mile, we first need the rate at which he runs.
This is Jason's speed, given by:
Speed = distance / time

Speed = 440 / 75 ≈ 5.87 yards per second
From the formula of speed, time is given as:
time = distance / speed
The time taken for Jason to run 1 mile (1760 yards) will therefore be:

time = 1760 ÷ (440/75) = 1760 × 75/440 = 300 seconds

(Keeping the speed as the exact fraction 440/75 avoids a rounding error: since 1760 yards is exactly four times 440 yards, the time is exactly 4 × 75 = 300 seconds.)
Converting this time to minutes:
60 seconds = 1 minute, so 300 seconds equals:

300 / 60 = 5 minutes
It takes Jason 5 minutes to run 1 mile.
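
As a quick sanity check, here is a minimal Python sketch of the same three steps. The 440 yards, 75 seconds, and 1760-yard mile are the values from the problem; the variable names are just for illustration:

```python
# Quick arithmetic check of the steps above.
distance_yards = 440   # distance Jason runs (from the problem)
time_seconds = 75      # time for that distance (from the problem)
mile_yards = 1760      # yards in one mile

speed = distance_yards / time_seconds   # yards per second (~5.87)
time_for_mile = mile_yards / speed      # seconds to run a mile (300.0)
minutes = time_for_mile / 60            # convert seconds to minutes (5.0)

print(f"Speed: {speed:.2f} yd/s")
print(f"Time for a mile: {time_for_mile:.0f} s ({minutes:.0f} minutes)")
```

Because the division is done with the exact speed rather than a rounded value, the result comes out to exactly 300 seconds, i.e. 5 minutes.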