Abstract
The ability of concrete to attenuate ionizing radiation is assessed using its linear or mass attenuation coefficient. In this work, the broad-beam linear and mass attenuation coefficients of different types of soils and cements used for making concrete were measured with a NaI detector at photon energies of 60-1333 keV, a range covering diagnostic energies and extending into those used in radiotherapy. The mass attenuation coefficient of cement decreased from 0.133 ± 0.002 cm²/g at 60 keV to 0.047 ± 0.003 cm²/g at 1332.5 keV. Among the soils, beach soil had the highest mass attenuation coefficient, decreasing from 0.176 ± 0.003 cm²/g at 60 keV to 0.054 ± 0.001 cm²/g at 1332.5 keV, while land soil had the lowest, decreasing from 0.124 ± 0.002 cm²/g at 60 keV to 0.044 ± 0.003 cm²/g at 1332.5 keV. Limestone had a smaller mass attenuation coefficient than the cement produced from it. These results suggest that beach sand should be preferred as the sand component of radiation-shielding concrete. Models of the form μ_L(E, ρ) = A(E) exp[B(E)ρ] and μ_m(E) = α ln(E) + β are proposed for fitting the linear and mass attenuation coefficient data, respectively.
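As an illustration of the proposed logarithmic model μ_m(E) = α ln(E) + β, the sketch below solves for α and β from the two beach-soil endpoints reported above (0.176 cm²/g at 60 keV and 0.054 cm²/g at 1332.5 keV). This is a minimal two-point illustration, not the authors' fitting procedure, which presumably uses all measured energies in a least-squares fit.

```python
import math

# Beach-soil endpoints from the abstract: (energy in keV, mu_m in cm^2/g)
e1, mu1 = 60.0, 0.176
e2, mu2 = 1332.5, 0.054

# Two-point solution of mu_m(E) = alpha*ln(E) + beta
alpha = (mu2 - mu1) / (math.log(e2) - math.log(e1))
beta = mu1 - alpha * math.log(e1)


def mu_m(energy_kev: float) -> float:
    """Mass attenuation coefficient (cm^2/g) from the log-linear model."""
    return alpha * math.log(energy_kev) + beta


# The fitted curve reproduces both endpoints by construction
print(f"alpha = {alpha:.4f}, beta = {beta:.4f}")
print(f"mu_m(60 keV)     = {mu_m(60.0):.3f} cm^2/g")
print(f"mu_m(1332.5 keV) = {mu_m(1332.5):.3f} cm^2/g")
```

A negative α reflects the decrease of μ_m with photon energy seen in all the measured materials; with all data points available, the same model form would be fitted by linear regression of μ_m against ln(E).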