Answer: 36 seconds
Step-by-step explanation:
First, let's express all quantities in consistent units (miles and hours):
Tiger speed = 50 mph
Cow speed = 25 mph
Cow's head start = a quarter of a mile = 0.25 miles
Let's say that at time 0, the tiger is at point 0 and the cow is at point 0.25 miles.
At time t, when the tiger catches up, they will be at the same point:
distance tiger = distance cow
initial point + time*speed = initial point + time*speed
0 + t*50 = 0.25 + 25*t
50t = 0.25 + 25t
50t - 25t = 0.25
25t = 0.25
t = 0.25/25
t = 0.01 hour
since 1 hour = 3600 seconds
0.01 hour = 36 seconds
It will take the tiger 36 seconds to catch its prey.
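If you want to double-check the arithmetic, here is a minimal Python sketch (the variable names are my own, chosen only for this illustration):

# Quick check of the catch-up time calculation
tiger_speed = 50.0   # mph
cow_speed = 25.0     # mph
head_start = 0.25    # miles (the cow's lead at time 0)

# The tiger closes the gap at (50 - 25) = 25 mph,
# so the catch-up time in hours is head start / closing speed.
t_hours = head_start / (tiger_speed - cow_speed)
t_seconds = t_hours * 3600  # 1 hour = 3600 seconds

print(t_hours)    # 0.01
print(t_seconds)  # 36.0

Running it prints 0.01 hours and 36.0 seconds, matching the result above.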