12 minutes
time = distance / speed
where distance is the distance traveled, speed is the average speed of the travel, and time is the time taken to cover the distance.
We are given that the distance traveled is 11 miles and the average speed is 55 miles per hour. To convert the speed to miles per minute, we can divide it by 60 since 1 hour is equal to 60 minutes:
55 miles per hour = 55/60 miles per minute
≈ 0.9167 miles per minute (rounded to four decimal places)
Now we can substitute the values into the formula:
time = distance / speed
= 11 miles / (55/60 miles per minute)
= 11 × 60/55 minutes
= 12 minutes
Note that keeping the speed as the exact fraction 55/60 gives exactly 12 minutes; using the rounded value 0.9167 would give approximately 12.00 minutes, differing only by rounding error.
Therefore, it would take exactly 12 minutes to travel 11 miles at an average speed of 55 miles per hour.
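For readers who want to check the arithmetic in code, here is a minimal Python sketch of the same calculation; the variable names are illustrative, not from any particular library:

```python
# Travel-time calculation: time = distance / speed
distance_miles = 11   # distance traveled
speed_mph = 55        # average speed in miles per hour

# Convert speed to miles per minute (1 hour = 60 minutes).
speed_miles_per_minute = speed_mph / 60

# Apply the formula.
time_minutes = distance_miles / speed_miles_per_minute

print(f"{time_minutes:.2f} minutes")  # prints "12.00 minutes"
```

Because the conversion is kept as an exact division rather than a pre-rounded constant, the result comes out to 12 minutes rather than accumulating rounding error.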