A cat starts to walk straight across a 100-meter-long field to her favorite tree. After 20 m, a dog sees the cat and chases it across the field and up the tree. If the average speed of the running cat is 10 m/s, how much time did it take for the cat to get to the tree?

A. 0.8 s
B. 2 s
C. 8 s
D. 10 s

asked by User Sange

1 Answer


Answer:

8 seconds, Answer choice C.

Step-by-step explanation:

The speed we are given (10 m/s) is the cat's average speed from the point at which the dog started chasing it.

Notice that the distance the cat actually ran is 100 meters minus the 20 meters it had already walked (100 - 20 = 80 meters). We therefore know the distance covered by the running cat (80 meters) and its speed (10 m/s), so we can use the definition of speed to find the time it took the cat to reach the tree:


\text{speed} = \frac{\text{distance}}{\text{time}} \;\Rightarrow\; 10\,\frac{\text{m}}{\text{s}} = \frac{80\,\text{m}}{\text{time}} \;\Rightarrow\; \text{time} = \frac{80}{10}\,\text{s} = 8\,\text{s}

Since all the physical quantities involved were given in SI units, the answer also comes out in the SI unit of time: seconds.
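If you want to double-check the arithmetic, here is a minimal Python sketch of the same calculation (the variable names are just illustrative, not part of the original problem):

# Sanity check: time = distance / speed for the running portion only
field_length = 100.0   # m, total length of the field
walked = 20.0          # m, distance walked before the dog gave chase
speed = 10.0           # m/s, average speed of the running cat

distance_run = field_length - walked    # 80 m covered while running
time_to_tree = distance_run / speed     # definition of average speed

print(time_to_tree)    # prints 8.0, i.e. 8 seconds -> answer choice C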

answered by User Max Pattern