Abstract
Soil toxicity resulting from natural or anthropogenic heavy metal contamination was evaluated with a nitrifying bacteria (NB) bioassay based on inhibition of oxygen consumption. Every contaminated soil sample inhibited O2 consumption in the bioassay, with inhibition levels ranging from 71% to 100%. The optimal test conditions for maximizing O2 consumption were established as a test culture volume of 1 mL, a soil sample weight of 1 g, a rotation rate of 100 rpm, and a reaction time of 48 h. In uncontaminated or lightly contaminated soils, oxygen consumption ranged from 3.0 mL to 3.2 mL from a 1 mL headspace filled with O2, whereas contaminated soils exhibited lower values, between 0.1 mL and 1.0 mL. EC50 values for NB O2 consumption were: Cr(6+) 1.21 mg/kg; Cu(2+) 6.92 mg/kg; Ag(+) 8.38 mg/kg; As(3+) 8.99 mg/kg; Ni(2+) 10.35 mg/kg; Hg(2+) 11.01 mg/kg; Cd(2+) 31.33 mg/kg; Pb(2+) 129.62 mg/kg. The inherent test variability (CVi), the variability arising from the natural characteristics of soil (CVns), and the minimal detectable difference (MDD) ranged from 1.6% to 4.7%, 7.8% to 14.6%, and 2.9% to 5.9%, respectively. A toxicity threshold of 10% was set as the maximal tolerable inhibition (MTI) for effective soil toxicity assessment. The NB bioassay offers a fast, affordable, and easy-to-use tool for near-real-time soil toxicity assessment, supporting soil health monitoring and ecosystem protection.
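The inhibition endpoint reported above can be sketched as a simple calculation: percent inhibition is the reduction in a sample's O2 consumption relative to an uncontaminated control. The function name and the specific numeric values below are illustrative assumptions, not taken from the paper; only the reported ranges (control consumption of roughly 3.0-3.2 mL, contaminated-soil consumption of 0.1-1.0 mL) come from the abstract.

```python
def percent_inhibition(o2_sample_ml: float, o2_control_ml: float) -> float:
    """Inhibition (%) = (1 - sample O2 consumption / control O2 consumption) * 100.

    Hypothetical helper illustrating the bioassay endpoint; the abstract
    reports inhibition of 71-100% for contaminated soils.
    """
    return (1.0 - o2_sample_ml / o2_control_ml) * 100.0


# Assumed example values within the ranges given in the abstract:
# control soils consumed ~3.0-3.2 mL O2, contaminated soils 0.1-1.0 mL.
control = 3.2
for sample in (1.0, 0.5, 0.1):
    print(f"{sample} mL consumed -> {percent_inhibition(sample, control):.0f}% inhibition")
```

Under these assumed values, the computed inhibition spans roughly 69-97%, consistent in magnitude with the 71-100% range reported for contaminated soils.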