Plant phenotyping, which involves measuring and analysing plant traits, has seen significant advances in recent years through the integration of autonomous platforms and sophisticated sensor systems. In contrast to traditional methods, modern unmanned ground vehicles (UGVs) provide robust and accurate phenotyping capabilities by enabling close, detailed and continuous monitoring of crops under different environmental conditions. This study presents the configuration and validation of a multi-sensor platform (MSP) integrated with a UGV to improve plant phenotyping through advanced data fusion and co-registration techniques. The platform incorporates red-green-blue (RGB), hyperspectral visible-light (VIS) and near-infrared (NIR), and thermal sensors, together with a three-dimensional (3D) light detection and ranging (LiDAR) sensor, all subjected to extensive calibration to ensure precise temporal and spatial alignment. Intrinsic calibration was applied to each sensor, including correction of the VIS and NIR spectral signatures. Additionally, timestamps were synchronised using the VIS sensor as the primary reference, owing to its central position and higher data acquisition frequency. Homography matrices were computed from checkerboard patterns for geometric alignment across sensors, and motion corrections accounted for UGV movement and ground sample distance. LiDAR point clouds were transformed into depth maps (DMs) using radial basis function interpolation, enriching the spatial data for further analysis. The co-registered and synchronised MSP was tested for detecting Cercospora leaf spot (CLS) in sugar beet plants during a field experiment. Two models were implemented: (1) a soil and plant segmentation model based on the DeepLabV3+ architecture, achieving an F1-score of 0.85 and an accuracy of 0.95, and (2) a CLS severity scoring model using a custom convolutional neural network (CNN). The severity model, leveraging the NIR and DM channels, achieved an F1-score of 0.7066, an accuracy of 0.7104 and a recall of 0.7167, with NIR wavelengths between 814 nm and 851 nm contributing significantly to performance. These results highlight the importance of accurate data fusion and synchronisation in multi-sensor systems for plant phenotyping. Moreover, the study demonstrates the potential of integrating multiple sensors on a UGV for precision agriculture, thereby enhancing MSP effectiveness in crop monitoring and disease detection.
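To illustrate the depth-map step mentioned above, the sketch below shows one possible way to densify sparse LiDAR returns into a depth map (DM) with radial basis function interpolation. It is a minimal example, not the authors' implementation: it assumes the LiDAR points have already been projected into the image plane of the reference sensor, and the function name lidar_to_depth_map, the grid size and the smoothing parameter are illustrative choices built on SciPy's RBFInterpolator.

# Minimal sketch (assumption, not the paper's code): densify sparse LiDAR depth
# samples into a dense depth map on a sensor pixel grid via RBF interpolation.
import numpy as np
from scipy.interpolate import RBFInterpolator

def lidar_to_depth_map(pixel_uv, depths, width, height, kernel="thin_plate_spline"):
    # pixel_uv: (N, 2) LiDAR points projected to image coordinates (u, v).
    # depths:   (N,) corresponding depth values in metres.
    # Returns a (height, width) dense depth map.
    rbf = RBFInterpolator(pixel_uv, depths, kernel=kernel, smoothing=1.0)
    # Evaluate the interpolant at every pixel of the target grid.
    uu, vv = np.meshgrid(np.arange(width), np.arange(height))
    grid = np.column_stack([uu.ravel(), vv.ravel()]).astype(float)
    return rbf(grid).reshape(height, width)

if __name__ == "__main__":
    # Synthetic stand-in for a projected LiDAR scan.
    rng = np.random.default_rng(0)
    uv = rng.uniform([0, 0], [64, 48], size=(300, 2))
    z = 1.5 + 0.01 * uv[:, 0] + 0.02 * uv[:, 1]
    dm = lidar_to_depth_map(uv, z, width=64, height=48)
    print(dm.shape, float(dm.min()), float(dm.max()))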