A good radiograph was made using a source-to-film distance of 16 inches and an exposure of 20 seconds. If the source-to-film distance is decreased to 8 inches, what would the correct exposure time be?

asked by User Mahi (8.0k points)

1 Answer


Final answer:

To maintain the same radiographic density, the exposure time must be reduced by a factor of 4 when the source-to-film distance is halved, giving a correct exposure time of 5 seconds.

Step-by-step explanation:

The question relates to the inverse square law in radiography, which states that radiation intensity is inversely proportional to the square of the distance from the source. Exposure time therefore scales with the square of the distance ratio: T2 = T1 x (D2/D1)^2. When the source-to-film distance is halved from 16 inches to 8 inches, the radiation intensity at the film increases by a factor of 4 (16/8 = 2, and 2^2 = 4). To maintain the same radiographic density, the exposure time must be reduced by the same factor: 20 seconds / 4 = 5 seconds.
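The calculation above can be sketched as a small Python helper (the function name is illustrative, not from any standard library):

```python
def new_exposure_time(old_time, old_distance, new_distance):
    """Apply the inverse square law to exposure time:
    T2 = T1 * (D2 / D1)**2, so halving the distance
    quarters the required exposure."""
    return old_time * (new_distance / old_distance) ** 2

# 20 s at 16 in, moved to 8 in:
print(new_exposure_time(20, 16, 8))  # 5.0 seconds
```

The same helper works in the other direction: increasing the distance from 16 to 32 inches would require 20 x (32/16)^2 = 80 seconds.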

answered by User Darren McAffee (7.2k points)