Okay, let's break this down step by step:
- Planes A and B take off from the same place at the same time.
- Plane B flies at 300 miles per hour.
- After 1 hour, the planes are 200 miles apart.
To find the speed of plane A, first pin down the geometry: the 200-mile gap is smaller than the 300 miles plane B covers on its own, so the planes can't be flying apart from each other. Plane A must be heading the same way as plane B and trailing behind it, which means the gap after 1 hour is the difference between the distances the two planes cover:
Speed of plane B = 300 mph
Distance between planes after 1 hour = 200 miles
So if we let v = speed of plane A in mph, we have:
300 - v = 200
v = 100
(If plane A were instead the faster plane, v - 300 = 200 would give 500 mph; the usual reading of this problem, with plane A slower, gives 100.)
Therefore, the speed of plane A is 100 miles per hour.
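If it helps, here's a quick sanity check in Python (the variable names are just illustrative):

```python
# Given values from the problem.
speed_b = 300  # plane B's speed in mph
gap = 200      # miles between the planes after 1 hour
hours = 1

# Same direction, A trailing B: gap = (speed_b - speed_a) * hours,
# so solving for plane A's speed:
speed_a = speed_b - gap / hours
print(speed_a)  # 100.0

# Verify: the distance between the planes after 1 hour matches the given gap.
assert abs(speed_b * hours - speed_a * hours) == gap
```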
Does this make sense? Let me know if you have any other questions!