A car is driving down a highway at a speed of 65 miles per hour. To the nearest second, how long will it take the car to drive 100 yards? (1 mile = 5280 feet, 1 yard = 3 feet)

asked by Ham Vocke (7.2k points)

1 Answer


Answer:

3 seconds (≈ 3.1469 sec)

Step-by-step explanation:

We are given that the speed of the car is 65 miles/hour.

The distance traveled by the car = 100 yards.

Since 1 yard = 3 feet,

100 yards = 100 × 3 = 300 feet

Since 1 mile = 5280 feet,

300 feet = 300/5280 ≈ 0.0568 miles

Now the time taken = distance/speed = 0.0568/65 ≈ 8.7413 × 10^(-4) hours ≈ 3.1469 sec

To the nearest second, the car takes 3 seconds.
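The unit conversions above can be checked with a short script (a sketch; the variable names are my own):

```python
MILE_FT = 5280   # feet per mile (given in the problem)
YARD_FT = 3      # feet per yard (given in the problem)

speed_mph = 65
distance_yd = 100

distance_mi = distance_yd * YARD_FT / MILE_FT   # 300 ft -> ~0.0568 mi
time_hr = distance_mi / speed_mph               # ~8.74e-4 hours
time_sec = time_hr * 3600                       # convert hours to seconds

print(round(time_sec, 4))   # 3.1469
print(round(time_sec))      # 3 (to the nearest second)
```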

answered by Louzoid (7.6k points)