Abstract
Accurate radiation thermometry of real objects depends critically on knowledge of the surface emissivity, which is rarely known a priori and often varies with surface condition, temperature, and environment. Although theoretical models for evaluating spectral emissivity exist, their practical validation under application-relevant conditions remains limited. In this study, spectral and radiometric emissivity evaluation methods are compared on metallic samples at temperatures up to 350 °C. The spectral method derives an effective emissivity by weighting the spectroscopically measured spectral emissivity with the instrument-specific spectral sensitivity (responsivity), while the radiometric method evaluates emissivity directly from radiance measurements using a radiation thermometer and a reference contact temperature. The radiometric method is treated as an application-level reference. Stable, homogeneous chromium nitride (CrN)-coated samples show good agreement between the two methods, whereas raw metals and polysiloxane-coated samples reveal practical limitations arising from surface instability and inhomogeneity. The results demonstrate that spectral emissivity evaluation is valid in practice when its underlying assumptions are fulfilled, while radiometric evaluation remains preferable for in situ infrared thermometry.