Abstract
PURPOSE: Dose and dose rate are both important parameters for estimating risk from internally deposited radioactive materials. We investigated the role of dose rate in lung cancer induction in Beagle dogs following a single inhalation of strontium-90 ((90)Sr), cerium-144 ((144)Ce), yttrium-91 ((91)Y), or yttrium-90 ((90)Y). Because retention of a radionuclide depends on both biological clearance and physical half-life, a representative quantity is needed to describe this continuously changing dose rate. MATERIALS AND METHODS: Data were obtained from Beagle dog experiments conducted at the Inhalation Toxicology Research Institute. The authors selected the dose rate at the effective half-life of each radionuclide (DRef). RESULTS: Dogs exposed at the highest DRef (1-100 Gy/day) died within the first year after exposure from acute lung disease. Dogs exposed at lower DRef (0.1-10 Gy/day) died of lung cancer. As DRef decreased further (<0.1 Gy/day for (90)Sr, <0.5 Gy/day for (144)Ce, <0.9 Gy/day for (91)Y, <8 Gy/day for (90)Y), survival and lung cancer frequency were not significantly different from those of control dogs. CONCLUSION: Radiation exposure from inhaled beta-gamma-emitting radionuclides, whose dose rate and cumulative dose decline at rates governed by the effective half-life, is less effective in causing lung cancer than acute low linear energy transfer exposure of the lung.
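The effective half-life combines physical decay and biological clearance (1/T_eff = 1/T_phys + 1/T_bio), and under a single-exponential retention model the dose rate at t = T_eff is simply half the initial dose rate. A minimal sketch of these relationships (the function names and the illustrative half-life values are assumptions, not from the paper):

```python
import math

def effective_half_life(t_phys_days: float, t_bio_days: float) -> float:
    """Effective half-life from physical and biological half-lives.

    1/T_eff = 1/T_phys + 1/T_bio, hence
    T_eff = (T_phys * T_bio) / (T_phys + T_bio).
    """
    return (t_phys_days * t_bio_days) / (t_phys_days + t_bio_days)

def dose_rate(dr0_gy_per_day: float, t_days: float, t_eff_days: float) -> float:
    """Dose rate at time t, assuming single-exponential retention
    with decay constant lambda_eff = ln(2) / T_eff."""
    lam = math.log(2) / t_eff_days
    return dr0_gy_per_day * math.exp(-lam * t_days)

# Illustrative values only: a nuclide with a 64-day physical half-life
# cleared biologically with a 200-day half-life.
t_eff = effective_half_life(64.0, 200.0)   # ~48.5 days

# By construction, the dose rate at t = T_eff (DRef) is half the initial rate.
dr_ef = dose_rate(2.0, t_eff, t_eff)       # 1.0 Gy/day from 2.0 Gy/day initial
```

This makes the paper's choice of DRef concrete: it is a single, comparable dose-rate value even though the actual dose rate changes continuously after intake.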