If an image of optimal radiographic density was acquired using a traditional film-screen system at 16 milliampere-seconds (mAs) and a source-image distance (SID) of 100 cm, what mAs is required to produce the same density at a new distance of 50 cm?

1 Answer

Final answer:

To maintain the same radiographic density when the source-image distance is halved from 100 cm to 50 cm, the milliampere-seconds must be reduced to 4 mAs, per the exposure maintenance (direct square) formula derived from the Inverse Square Law.

Step-by-step explanation:

This question involves the Inverse Square Law in radiography, which states that the intensity of radiation is inversely proportional to the square of the distance from the source. Given an initial exposure in milliampere-seconds (mAs) at a known source-image distance (SID), we can find the new mAs required to maintain the same image density when the distance changes. Here, the original exposure is 16 mAs at 100 cm, and the new distance is 50 cm.

The Inverse Square Law formula is I1/I2 = (D2/D1)^2, where I is the relative intensity and D is the distance. Because moving the tube closer to the receptor increases the intensity reaching it, the mAs must be adjusted in the same direction as the square of the distance to keep the receptor exposure constant. This is the exposure maintenance (direct square) formula: mAs1/mAs2 = (D1/D2)^2, which rearranges to mAs2 = mAs1 × (D2/D1)^2. In our case, mAs2 = 16 mAs × (50/100)^2 = 16 mAs × 1/4 = 4 mAs. Intuitively, halving the distance quadruples the beam intensity at the receptor, so only one quarter of the original mAs is needed.

So, to produce the same radiographic density at a distance of 50 cm as was produced at 100 cm with an exposure of 16 mAs, the required mAs at 50 cm is 4 mAs.
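The calculation above can be sketched as a small helper function. This is an illustrative example only; the function name `mas_for_new_sid` and its parameters are invented for this sketch, not from any standard library.

```python
def mas_for_new_sid(mas1: float, sid1: float, sid2: float) -> float:
    """Exposure maintenance (direct square) formula.

    mAs2 = mAs1 * (SID2 / SID1)^2  --  keeps receptor exposure
    constant when the source-image distance changes.
    """
    return mas1 * (sid2 / sid1) ** 2

# Original exposure: 16 mAs at 100 cm; new SID: 50 cm.
new_mas = mas_for_new_sid(16, 100, 50)
print(new_mas)  # → 4.0
```

Note that moving closer (SID2 < SID1) always yields a smaller mAs, and moving farther away yields a larger one, matching the intuition that intensity falls off with the square of distance.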
