A hollow cylindrical (inner radius = 1.0 mm, outer radius = 3.0 mm) conductor carries a current of 80 A parallel to its axis. This current is uniformly distributed over a cross section of the conductor. Determine the magnitude of the magnetic field at a point that is 2.0 mm from the axis of the conductor.

1 Answer

Answer:

The magnetic field is
B = 3 mT

Step-by-step explanation:

From the question we are told that

The inner radius is r_1 = 1.0 \ mm = 1.0*10^(-3) \ m

The outer radius is r_2 = 3.0 \ mm = 3.0*10^(-3) \ m

The distance of the field point from the axis of the conductor is d = 2.0 \ mm = 2.0*10^(-3) \ m

The current carried by the conductor is I = 80 \ A

According to Ampere's circuital law, the magnetic field at a point a distance d from the axis of the conductor (with r_1 < d < r_2) is

B = (\mu_o I)/(2 \pi d) [(d^2 - r_1^2)/(r_2^2 - r_1^2)]

because the current is spread uniformly over the annular cross section, so an Amperian loop of radius d encloses only the fraction (d^2 - r_1^2)/(r_2^2 - r_1^2) of the total current.

Where \mu_o is the permeability of free space, \mu_o = 4 \pi *10^(-7) \ N/A^2

Substituting values, the current enclosed by an Amperian loop of radius d is

I_(enc) = 80 * ((2.0^2 - 1.0^2)*10^(-6))/((3.0^2 - 1.0^2)*10^(-6)) = 80 * (3/8) = 30 \ A

so

B = ((4 \pi *10^(-7)) * 30)/(2 \pi * 2.0*10^(-3)) = 3.0*10^(-3) \ T


B = 3 mT
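
For completeness, here is a quick numerical check of the arithmetic above (a minimal Python sketch; the variable names are my own, not from the question):

import math

# Given quantities, converted to SI units
r1 = 1.0e-3   # inner radius, m
r2 = 3.0e-3   # outer radius, m
d  = 2.0e-3   # distance of the field point from the axis, m
I  = 80.0     # total current, A
mu0 = 4 * math.pi * 1e-7  # permeability of free space, N/A^2

# Current enclosed by an Amperian loop of radius d
# (uniform current density over the annulus r1 < r < r2)
I_enc = I * (d**2 - r1**2) / (r2**2 - r1**2)

# Ampere's law: B * 2*pi*d = mu0 * I_enc
B = mu0 * I_enc / (2 * math.pi * d)

print(f"I_enc = {I_enc:.1f} A")    # prints 30.0 A
print(f"B = {B * 1e3:.1f} mT")     # prints 3.0 mT

Running this reproduces the enclosed current of 30 A and the field of 3 mT found above.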
