We can calculate the average speed as the quotient of the distance traveled and the time it took to cover that distance.
In this case, the distance is 8100 miles and the time is 15 hours.
We have to express the speed in feet per hour.
We will use the equivalence 1 mile = 5280 feet. To convert the units we multiply the result by (5280 ft / 1 mile), which has a value of 1 and therefore lets us change the units without changing the value.
Then, we can calculate the average speed as:
$$
\begin{gathered}
v=\frac{d}{t} \\
v=\frac{8100\ \text{mi}}{15\ \text{h}}\cdot\frac{5280\ \text{ft}}{1\ \text{mi}}=2{,}851{,}200\ \frac{\text{ft}}{\text{h}}
\end{gathered}
$$
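As a quick sanity check, here is a minimal Python sketch of the same arithmetic; the variable names are illustrative only.

```python
# Verify the conversion from miles per hour to feet per hour.
distance_miles = 8100    # distance covered, in miles
time_hours = 15          # elapsed time, in hours
feet_per_mile = 5280     # exact conversion factor

speed_mph = distance_miles / time_hours   # 540 mi/h
speed_fph = speed_mph * feet_per_mile     # 2,851,200 ft/h
print(f"Average speed: {speed_fph:,.0f} feet per hour")
```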
Answer: the average speed was 2,851,200 feet per hour.