1 vote
An airplane departs airport A at a heading of N 60° W. After traveling 320 miles, the airplane adjusts its course to N 10° W and flies an additional 112 miles to reach airport B. The diagram shows the positions of airports A and B: a ray labeled 320 miles begins at airport A and extends toward the northwest, and a second ray labeled 112 miles extends to airport B, making a 130-degree angle with the first. The distance between airports A and B is x. The distance between airports A and B is approximately ___ miles.

2 Answers

2 votes

Final answer:

To find the distance between airports A and B, we can use the concept of vectors and calculate the magnitude of the resultant vector C, which is the sum of vectors A and B.

Step-by-step explanation:

To find the distance between airports A and B, we can use the concept of vectors. Let vector A represent the first leg of the trip (320 miles at a heading of N 60° W) and vector B the second leg (112 miles at a heading of N 10° W). The distance between the airports is the magnitude of the resultant vector C = A + B. Since the course changes by 50°, the interior angle of the triangle at the turn is 180° − 50° = 130°, and we can calculate the magnitude of C as follows:

  • Magnitude of vector A = 320 miles
  • Magnitude of vector B = 112 miles
  • Interior angle of the triangle at the course change = 180° − 50° = 130°
  • Magnitude of vector C = √(320² + 112² − 2 × 320 × 112 × cos(130°))

Substituting the values into the equation gives the approximate distance between airports A and B.
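As a quick numerical check, here is a minimal Python sketch of this vector approach, assuming the headings N 60° W (bearing 300°) and N 10° W (bearing 350°) used in the second answer. It resolves each leg into (east, north) components and takes the magnitude of their sum:

```python
import math

def displacement(miles, bearing_deg):
    """Convert a leg (distance, bearing clockwise from north) to (east, north) components."""
    theta = math.radians(bearing_deg)
    return miles * math.sin(theta), miles * math.cos(theta)

# Leg 1: 320 miles at N 60 W (bearing 300); Leg 2: 112 miles at N 10 W (bearing 350)
e1, n1 = displacement(320, 300)
e2, n2 = displacement(112, 350)

# Resultant vector C = A + B; its magnitude is the straight-line distance from A to B
east, north = e1 + e2, n1 + n2
print(math.hypot(east, north))  # ~401.27 miles
```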

User BaSsGaz, 8.6k points
3 votes

The distance between airports A and B is approximately 401.27 miles.

How to determine the distance between airports A and B?

To find the distance between airports A and B, we can use the law of cosines, since we have a triangle formed by the initial path (320 miles), the additional path (112 miles), and the included angle created when the airplane adjusts its course.

Let's denote the distance between airports A and B as 'd'. Using the law of cosines:


\[ d^2 = 320^2 + 112^2 - 2 \times 320 \times 112 \times \cos(180^\circ - (350^\circ - 300^\circ)) \]

First, let's find the angle between the two legs of the triangle. The headings N 60° W and N 10° W correspond to bearings of 300° and 350°, so the course changes by 50°, and the interior angle of the triangle is its supplement:


\[ 180^\circ - (350^\circ - 300^\circ) = 180^\circ - 50^\circ = 130^\circ \]

Now, substitute this into the law of cosines equation:


\[ d^2 = 320^2 + 112^2 - 2 \times 320 \times 112 \times \cos(130^\circ) \]

Calculate the cosine of 130°:


\[ \cos(130^\circ) \approx -0.6428 \]

Now, substitute this into the equation:


\[ d^2 = 320^2 + 112^2 - 2 \times 320 \times 112 \times (-0.6428) \]


\[ d^2 = 102400 + 12544 + 46075.904 \]


\[ d^2 = 161019.904 \]

Finally, take the square root to find 'd':


\[ d \approx \sqrt{161019.904} \]


\[ d \approx 401.27 \]

Therefore, the distance between airports A and B is approximately 401.27 miles.
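Here is a minimal Python sketch of the same law-of-cosines calculation, using the exact cosine rather than the rounded −0.6428:

```python
import math

a, b = 320.0, 112.0          # the two legs, in miles
angle = math.radians(130.0)  # interior angle at the course change

# Law of cosines: d^2 = a^2 + b^2 - 2ab*cos(C)
d = math.sqrt(a**2 + b**2 - 2 * a * b * math.cos(angle))
print(round(d, 2))  # 401.27
```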

User Hackose, 8.3k points