Abstract
This paper presents an experimental evaluation of the accuracy of two current transformers (CTs) under linear and nonlinear load scenarios. Four performance indicators, namely measured RMS current, total harmonic distortion (THD), ratio error, and phase error, were assessed across seven sample load scenarios in compliance with the IEEE C57.13 and IEC 61869 standards. The findings demonstrate that whereas CT accuracy is high under near-sinusoidal excitation, it deteriorates with increasing waveform distortion and load current. Higher current levels exacerbate core nonlinearity and saturation, producing larger ratio and phase errors, while increased harmonic content raises THD and causes phase displacement between the primary and secondary currents. Under nonlinear load conditions, measured RMS values are consistently attenuated, so actual current magnitudes are underestimated. The main contribution of this work is a synchronized, time-domain experimental evaluation showing that phase error varies in tandem with ratio error and THD as functions of current magnitude and harmonic content. This experimentally verified coupling between amplitude and angular errors exposes the limitations of sinusoidal-based CT accuracy classifications. The work also highlights the potential of advanced corrective techniques, such as real-time error detection, harmonic compensation, and artificial intelligence-based predictive modeling, to enhance CT reliability in distortion-rich power networks.
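The four performance indicators named above can be sketched numerically as follows. This is a minimal illustrative example, not the paper's actual measurement pipeline: the sampling rate, turns ratio, and synthetic waveforms (including the injected 5th harmonic, attenuation factor, and harmonic range considered for THD) are all assumptions chosen for demonstration.

```python
import numpy as np

# Assumed acquisition and CT parameters (illustrative only)
fs = 10_000      # sampling rate, Hz
f0 = 50          # fundamental frequency, Hz
N_ratio = 100    # nominal CT turns ratio

t = np.arange(0, 0.2, 1 / fs)  # 10 fundamental cycles
# Synthetic primary current with a 5th-harmonic component (nonlinear load)
i_primary = 100 * np.sin(2 * np.pi * f0 * t) + 20 * np.sin(2 * np.pi * 5 * f0 * t)
# Synthetic secondary current with a small magnitude error (phase shift omitted)
i_secondary = (i_primary / N_ratio) * 0.99

def rms(x):
    """RMS value of a sampled waveform."""
    return np.sqrt(np.mean(x ** 2))

def thd(x, fs, f0, n_harm=9):
    """THD as harmonic RMS content over the fundamental, from FFT magnitudes."""
    spectrum = np.abs(np.fft.rfft(x)) / len(x)
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    fund = spectrum[np.argmin(np.abs(freqs - f0))]
    harmonics = [spectrum[np.argmin(np.abs(freqs - k * f0))]
                 for k in range(2, n_harm + 1)]
    return np.sqrt(np.sum(np.square(harmonics))) / fund

def fund_phase(x, fs, f0):
    """Phase of the fundamental component, in radians."""
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    X = np.fft.rfft(x)
    return np.angle(X[np.argmin(np.abs(freqs - f0))])

# Ratio error (%) per the usual convention: (Kn * Is - Ip) / Ip * 100
ratio_error = (N_ratio * rms(i_secondary) - rms(i_primary)) / rms(i_primary) * 100

# Phase error: fundamental phase displacement of secondary vs primary, degrees
phase_error_deg = np.degrees(fund_phase(i_secondary, fs, f0)
                             - fund_phase(i_primary, fs, f0))

print(f"RMS(primary) = {rms(i_primary):.2f} A, THD = {thd(i_primary, fs, f0):.3f}, "
      f"ratio error = {ratio_error:.2f}%, phase error = {phase_error_deg:.4f} deg")
```

With these synthetic inputs the 5th harmonic (amplitude 20 A against a 100 A fundamental) yields a THD of 0.2, and the 0.99 attenuation factor appears directly as a -1% ratio error, mirroring the paper's observation that amplitude and angular errors can be extracted jointly from synchronized time-domain records.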