16 votes
A wire carries a current of 4.2 A. At what distance from the wire does the magnetic field have a magnitude of 1.3 × 10⁻⁵ T?

asked by Sagar Kulthe (3.0k points)

1 Answer

15 votes

Answer:

The distance is 6.46 cm.

Step-by-step explanation:

Given

current in the wire, I = 4.2 A

magnitude of the magnetic field, B = 1.3 x 10⁻⁵ T

The distance is found from the formula for the magnetic field of a long straight wire (which follows from the Biot-Savart law):


B = μ₀I / (2πr)

Solving for r:

r = μ₀I / (2πB)

Where;

r is the distance from the wire where the magnetic field is experienced


r = μ₀I / (2πB)

r = (4π × 10⁻⁷ × 4.2) / (2π × 1.3 × 10⁻⁵)

r = 0.0646 m

r = 6.46 cm
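The calculation above can be sketched in a few lines of Python (the function name `distance_for_field` is just an illustrative choice, not from the original answer):

```python
import math

# Vacuum permeability mu0 in T·m/A
MU0 = 4 * math.pi * 1e-7

def distance_for_field(current_a, field_t):
    """Distance (m) from a long straight wire carrying current_a (A)
    at which the field magnitude equals field_t (T): r = mu0*I / (2*pi*B)."""
    return MU0 * current_a / (2 * math.pi * field_t)

# Values from the problem: I = 4.2 A, B = 1.3e-5 T
r = distance_for_field(4.2, 1.3e-5)
print(f"r = {r:.4f} m = {r * 100:.2f} cm")  # r = 0.0646 m = 6.46 cm
```

Note that the 4π in μ₀ cancels against the 2π in the denominator, leaving r = 2 × 10⁻⁷ × I / B, which is a quick way to check the result by hand.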

Therefore, the distance is 6.46 cm.

answered by Halbano (2.9k points)