Final answer:
To find the shortest distance between points X(70°N, 110°W) and Y, the opposite end of the diameter of the parallel of latitude 70°N, we use great-circle navigation. The shortest route runs over the North Pole and is 2,400 nautical miles long.
Step-by-step explanation:
The shortest distance between two points on the Earth's surface lies along a great circle, the intersection of the sphere with a plane that passes through its centre. Every parallel of latitude except the equator is a small circle, not a great circle, so the arc of the 70°N parallel joining X and Y is not the shortest route; the great circle through X and Y passes over the North Pole.
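As a quick illustration, here is a minimal Python sketch of the great-circle calculation, using the spherical law of cosines and the fact that one nautical mile is one minute of arc of a great circle. The function name and the coordinates assumed for Y (70°N, 70°E, the other end of the diameter) are illustrative, not taken from the original question:

```python
import math

def great_circle_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles between two points
    given in decimal degrees (N and E positive, S and W negative)."""
    phi1, lam1, phi2, lam2 = map(math.radians, (lat1, lon1, lat2, lon2))
    # Central angle from the spherical law of cosines.
    central = math.acos(
        math.sin(phi1) * math.sin(phi2)
        + math.cos(phi1) * math.cos(phi2) * math.cos(lam2 - lam1)
    )
    # One nautical mile is one minute of great-circle arc, so
    # degrees of arc * 60 gives the distance in nautical miles.
    return math.degrees(central) * 60

# X(70°N, 110°W) and Y(70°N, 70°E), the two ends of the diameter of the 70°N parallel.
print(round(great_circle_nm(70, -110, 70, 70)))  # 2400
```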
Since XY is a diameter of the parallel of latitude 70°N, Y lies at 70°N, 70°E, 180° of longitude away from X. The radius of that parallel is r = R cos 70°, where R is the radius of the Earth, so its circumference is C = 2πr = 2πR cos 70°. The distance from X to Y along the parallel is half of this, πR cos 70°; expressed in nautical miles (one minute of great-circle arc each), that is 180 × 60 × cos 70° ≈ 3,694 nautical miles. Because the parallel is a small circle, this is not the shortest route.
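For comparison, the distance along the parallel can be checked with a small snippet (illustrative only, not part of the original working): each degree of longitude along the 70°N parallel is worth 60 × cos 70° nautical miles.

```python
import math

# Half the circumference of the 70°N parallel, in nautical miles:
# 180° of longitude, each worth 60 * cos(70°) nautical miles along the parallel.
along_parallel = 180 * 60 * math.cos(math.radians(70))
print(round(along_parallel))  # 3694
```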
The shortest route therefore follows the great circle through the North Pole: from X at 70°N up to the pole is 90° − 70° = 20° of arc, and from the pole down to Y is another 20°, giving 40° of great-circle arc in total. Since one minute of arc of a great circle is one nautical mile, the shortest distance between X and Y is 40 × 60 = 2,400 nautical miles.
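The same arithmetic in a couple of lines, with a conversion to kilometres using the standard 1.852 km per nautical mile (the kilometre conversion is an addition for context, not part of the original question):

```python
arc_degrees = 2 * (90 - 70)                         # X to the pole plus the pole to Y
distance_nm = arc_degrees * 60                      # 1 nautical mile = 1 minute of great-circle arc
print(distance_nm, round(distance_nm * 1.852, 1))   # 2400 4444.8
```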