Abstract
Accurate determination of molecular transition intensities is vital to quantum chemistry and metrology, yet even for simple diatomic molecules such measurements have historically been limited to 0.1% accuracy. Here, we show that frequency-domain measurements of relative intensity ratios outperform absolute methods, achieving 0.003% accuracy using dual-wavelength cavity mode dispersion spectroscopy. Enabled by high-precision frequency metrology, this approach reveals systematic discrepancies with state-of-the-art ab initio calculations, exposing subtle electron-correlation effects in the dipole moment curve. Applied to line-intensity ratio thermometry (LRT), our technique determines gas temperatures with 0.5 millikelvin statistical uncertainty, exceeding previous LRT precision by two orders of magnitude. These results redefine the limits of optical gas metrology and enable measurements traceable to the International System of Units for applications ranging from combustion diagnostics to isotopic analysis. Discrepancies of up to 0.02% in transition-probability ratios challenge theorists to refine their models, establishing intensity ratios as a new paradigm in precision molecular physics.
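The thermometric principle underlying LRT can be illustrated with a minimal sketch. For two transitions of the same species, the partition function cancels in the intensity ratio, which then depends on temperature only through the Boltzmann factor of the lower-state energy difference. The code below is an illustrative simplification, not the paper's actual analysis: the reference ratio, the lower-state energy spacing, and the neglect of stimulated-emission corrections are all assumptions made for clarity.

```python
import math

C2 = 1.4387769  # second radiation constant hc/k in cm*K (CODATA-based)

def intensity_ratio(T, ratio_ref, dE_lower, T_ref=296.0):
    """Intensity ratio of two lines of one species at temperature T (K).

    ratio_ref : ratio measured at the reference temperature T_ref
    dE_lower  : lower-state energy difference E1'' - E2'' in cm^-1
    The partition function cancels in the ratio; stimulated-emission
    factors are neglected in this simplified sketch.
    """
    return ratio_ref * math.exp(-C2 * dE_lower * (1.0 / T - 1.0 / T_ref))

def temperature_from_ratio(ratio, ratio_ref, dE_lower, T_ref=296.0):
    """Invert the Boltzmann relation to recover the gas temperature."""
    return 1.0 / (1.0 / T_ref - math.log(ratio / ratio_ref) / (C2 * dE_lower))

# Round trip: a ratio synthesized at 300 K maps back to 300 K.
T = temperature_from_ratio(intensity_ratio(300.0, 1.0, 500.0),
                           1.0, 500.0)
```

The sensitivity follows by differentiation: a relative ratio change dR/R corresponds to a temperature change dT = (T²/(c₂ΔE″)) dR/R, which is why sub-0.01% ratio accuracy translates into millikelvin-level thermometry.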