Abstract
Domain adaptation in agricultural settings has traditionally focused on 2D imagery, leaving a significant gap in the robust application of 3D sensing for plant monitoring and classification. In this paper, we propose an adversarial unsupervised domain adaptation framework for 3D point cloud classification in agriculture, addressing the domain shift between the in-field Crops3D and controlled-environment Pheno4D datasets. Our approach combines a PointNet-based feature extractor, a domain discriminator trained through a Gradient Reversal Layer (GRL), and an entropy minimization objective that encourages confident predictions on the unlabeled target domain. Extensive experiments show that our method achieves 97% classification accuracy on the target domain, with strong per-class F1 scores, despite substantial sensor and environmental differences between the datasets. We further evaluate performance in real-time scenarios and discuss deployment feasibility on edge devices. This work highlights the potential of 3D domain adaptation for precision agriculture and paves the way for more generalizable plant phenotyping models.
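The two training signals named above — adversarial domain confusion via a GRL and entropy minimization on target predictions — can be sketched in PyTorch as follows. This is a minimal illustration, not the paper's implementation; the `lam` weighting and the `1e-8` stabilizer are assumptions.

```python
import torch

class GradientReversal(torch.autograd.Function):
    """Identity in the forward pass; scales gradients by -lam in the backward pass,
    so the feature extractor is trained to *fool* the domain discriminator."""

    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reverse (and scale) the gradient flowing to the features; no grad for lam.
        return -ctx.lam * grad_output, None

def grad_reverse(x, lam=1.0):
    """Apply the GRL before feeding features to the domain discriminator."""
    return GradientReversal.apply(x, lam)

def entropy_loss(logits):
    """Mean Shannon entropy of the predicted class distribution; minimizing it
    pushes the classifier toward confident predictions on unlabeled target data."""
    p = torch.softmax(logits, dim=1)
    return -(p * torch.log(p + 1e-8)).sum(dim=1).mean()
```

In training, source features go straight to the label classifier, while both source and target features pass through `grad_reverse` into the domain discriminator; `entropy_loss` is applied only to the classifier's logits on target batches.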