Final answer:
The tiger will land about 3.5 meters from the base of the rock.
Step-by-step explanation:
The tiger leaves the rock horizontally, so its horizontal speed stays constant at 3.4 m/s while gravity pulls it downward. The horizontal distance is:
distance = speed × time
The time in the air is determined entirely by the vertical drop, not by the horizontal speed. For an object falling from rest through a height h:
h = ½ g t², so t = √(2h / g)
Substituting h = 5.1 m and g = 9.8 m/s²:
t = √(2 × 5.1 m / 9.8 m/s²) ≈ 1.02 s
Now we can calculate the horizontal distance:
distance = speed × time
distance = 3.4 m/s × 1.02 s ≈ 3.5 m
Therefore, the tiger will land about 3.5 meters from the base of the rock.
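The arithmetic above can be double-checked with a short script. This is just a sketch of the same two-step calculation, assuming g = 9.8 m/s² and the 5.1 m drop height given in the problem (the variable names are my own):

```python
import math

g = 9.8  # gravitational acceleration, m/s^2 (assumed value)
h = 5.1  # height of the rock, m
v = 3.4  # horizontal launch speed, m/s

# Time to fall height h from rest: h = (1/2) g t^2  =>  t = sqrt(2h/g)
t = math.sqrt(2 * h / g)

# Horizontal distance covered at constant speed v during the fall
d = v * t

print(f"time of flight: {t:.2f} s")    # ≈ 1.02 s
print(f"landing distance: {d:.2f} m")  # ≈ 3.47 m
```

Note that the horizontal speed never enters the time calculation; it only converts the fall time into a landing distance.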