A small radio transmitter broadcasts in a 69 mile radius. If you drive along a straight line from a city 93 miles north of the transmitter to a second city 78 miles east of the transmitter, during how much of the drive will you pick up a signal from the transmitter?


1 Answer


Answer:

You will have a signal for roughly 57% of the drive; see the explanation below.

Explanation:

According to the question:

  • A small radio transmitter broadcasts in a 69-mile radius, and you drive along a straight line from a city 93 miles north of the transmitter to a second city 78 miles east of it.

Thus,

The distance over which you get reception is the length of the chord created by the intersection of the circle defining the edge of transmission and the line defining the car trip.

  • x² + y² = 69² (the circle)

Place the transmitter at the origin, with the city to the north at (0, 93) and the city to the east at (78, 0).

The slope is m = (0 - 93)/(78 - 0) = -93/78

The intercept is b = y - mx = 93 - (-93/78)(0) = 93, using the point (0, 93)

The equation of the line between the cities is therefore:

  • y = (-93/78)x + 93 (the line)
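
For anyone who wants to double-check this setup numerically, here is a minimal Python sketch (the variable names are my own, not part of the problem):

```python
# Transmitter at the origin; cities as given in the problem.
north_city = (0, 93)   # 93 miles north of the transmitter
east_city = (78, 0)    # 78 miles east of the transmitter

# Slope and intercept of the line y = m*x + b through the two cities.
m = (east_city[1] - north_city[1]) / (east_city[0] - north_city[0])
b = north_city[1] - m * north_city[0]

print(m, b)  # -1.1923... 93.0, i.e. y = (-93/78)x + 93
```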

Now, solve these two equations simultaneously. Substituting the line into the circle gives:

x² + [(-93/78)x + 93]² = 69²

  • Solving this quadratic gives the intersection points, approximately (67.952, 11.980) and (23.628, 64.828)
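
The two intersection points can be verified by expanding the substitution into a quadratic in x and applying the quadratic formula; a short Python sketch of that check (standard library only):

```python
import math

m, b, r = -93 / 78, 93, 69

# Substituting y = m*x + b into x^2 + y^2 = r^2 and collecting terms gives
# (1 + m^2) x^2 + 2*m*b x + (b^2 - r^2) = 0.
A = 1 + m**2
B = 2 * m * b
C = b**2 - r**2

root = math.sqrt(B**2 - 4 * A * C)
x1, x2 = (-B + root) / (2 * A), (-B - root) / (2 * A)
y1, y2 = m * x1 + b, m * x2 + b

print((x1, y1))  # approximately (67.952, 11.980)
print((x2, y2))  # approximately (23.628, 64.828)
```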

From the Pythagorean theorem, the total distance of the trip is:

d1 = √(93² + 78²) ≈ 121.38 miles

And the distance over which the signal is picked up is:

d2 = √[(67.952 - 23.628)² + (64.828 - 11.980)²] ≈ 68.98 miles

So (d2/d1)·100 ≈ 56.8%: you will pick up a signal from the transmitter for roughly 57% of the drive.
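
As a final sanity check, the trip length, the in-signal chord length, and the percentage can be recomputed directly from the intersection points (a quick Python sketch):

```python
import math

# Full drive between the two cities and the chord inside the 69-mile circle.
d1 = math.hypot(93, 78)
d2 = math.hypot(67.952 - 23.628, 64.828 - 11.980)

print(d1)             # approximately 121.38 miles
print(d2)             # approximately 68.98 miles
print(100 * d2 / d1)  # approximately 56.8% of the drive
```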
