Abstract
Lock-in thermography is a widely adopted infrared nondestructive testing technique that detects subsurface defects by applying modulated thermal waves and analyzing the resulting surface temperature variations. However, quantitatively characterizing subsurface defects at varying depths remains a significant challenge. This study investigates the lateral resolution of subsurface defect detection using phase-based lock-in thermography, combining analytical modeling, finite element simulation, and phase difference analysis. The results demonstrate that defect visibility and boundary definition depend strongly on the excitation frequency. The thermal diffusion length, which is inversely proportional to the square root of the excitation frequency, governs both the penetration depth and the lateral spread of thermal energy. Higher frequencies enhance lateral resolution, whereas lower frequencies improve the detectability of deeper defects. Detection becomes particularly difficult for defects with small radii or low radius-to-depth ratios. A critical radius-to-depth ratio of 2 is identified as the threshold for reliable boundary delineation. These findings offer practical guidance for selecting excitation frequencies that balance depth sensitivity against lateral resolution in thermal-wave-based nondestructive evaluation.
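The inverse-square-root dependence of the thermal diffusion length on excitation frequency can be sketched numerically. The snippet below uses the standard relation μ = √(α / (π f)); the diffusivity value and the frequencies are illustrative assumptions, not parameters from this study.

```python
import math

def thermal_diffusion_length(alpha, f):
    """Thermal diffusion length mu = sqrt(alpha / (pi * f)).

    alpha : thermal diffusivity in m^2/s
    f     : lock-in excitation frequency in Hz
    """
    return math.sqrt(alpha / (math.pi * f))

# Illustrative diffusivity, roughly that of steel (assumed, not from the study).
alpha = 1.2e-5  # m^2/s
for f in (0.01, 0.1, 1.0, 10.0):
    mu_mm = thermal_diffusion_length(alpha, f) * 1e3
    print(f"f = {f:5.2f} Hz -> diffusion length ~ {mu_mm:.2f} mm")
```

Quadrupling the frequency halves the diffusion length, which is why higher frequencies sharpen lateral resolution at the cost of penetration depth.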