Online monitoring is essential for in-process quality control in laser additive manufacturing (AM), as it enables early detection of defects and thereby prevents build failures. However, given the intricate interplay of cross-scale multiphysics phenomena in the laser AM process, a single sensor cannot acquire comprehensive state information that fully characterizes the forming mechanism. Moreover, existing monitoring strategies do not sufficiently exploit the heterogeneous features embedded in the various types of process state information, so accurately predicting localized defects in the laser AM process remains challenging. This paper proposes a localized defect prediction method for the laser AM process based on a multisensor monitoring strategy and a multi-feature fusion convolutional neural network (MFFCNN). A coaxially integrated CCD camera and dichromatic pyrometer capture the morphological and thermal dynamics of the molten pool, enabling in-situ sensing within the laser-material interaction zone. To establish dataset labels, internal defects are detected by X-ray CT scanning, and the extracted defect information is used for accurate annotation. The proposed multisensor monitoring strategy maps the multisensor-captured process data to local defects according to their spatiotemporal relationships. Furthermore, a rolling-time-window sampling mode is employed for continual online monitoring, and three window lengths (0.2 s, 0.3 s, and 0.4 s) are evaluated to determine the optimal one. To fully exploit the wealth of quality-related information in the multisensor-captured molten pool data, the proposed MFFCNN incorporates a feature extraction module, a feature fusion module, and a decision-making module, aimed at achieving precise defect prediction.
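The rolling-time-window sampling described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 1 kHz sampling rate, the 0.1 s stride, and the function name are assumptions; only the candidate window lengths (0.2 s, 0.3 s, 0.4 s) come from the text.

```python
import numpy as np

def rolling_windows(timestamps, window_s, stride_s):
    """Return (start, end) index pairs that slice a timestamped sensor
    stream into rolling time windows of length window_s, advanced by
    stride_s each step (hypothetical helper for illustration)."""
    t0, t_end = timestamps[0], timestamps[-1]
    windows = []
    start_t = t0
    while start_t + window_s <= t_end:
        lo = np.searchsorted(timestamps, start_t, side="left")
        hi = np.searchsorted(timestamps, start_t + window_s, side="left")
        windows.append((lo, hi))
        start_t += stride_s
    return windows

# 1 s of synchronized process data sampled at an assumed 1 kHz
ts = np.arange(0.0, 1.0, 0.001)
wins = rolling_windows(ts, window_s=0.4, stride_s=0.1)
print(len(wins))  # number of 0.4 s windows fitting in the stream
```

Each index pair selects the camera frames and pyrometer samples falling inside one window, which then form one input sample for the network.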
Four feature fusion techniques, namely feature concatenation, feature overlay, feature maximum coupling, and feature mean coupling, are systematically investigated to ascertain the most suitable method. Experimental results show that a 0.4 s rolling time window, combined with the feature overlay method for fusing two-dimensional process temperature data and in-process images, achieves the highest average accuracy of 93.68 %. The effectiveness of the proposed approach is further validated through comparative analysis with a single-sensor-based method without feature fusion.
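The four fusion operations compared above can be sketched on two same-shape feature maps. This is an assumed interpretation for illustration: "overlay" is taken as element-wise addition, and the channel-first array shapes are placeholders, not the paper's actual feature dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Assumed feature maps from the two sensor branches (channels, H, W)
f_image = rng.standard_normal((64, 8, 8))  # in-process image features
f_temp = rng.standard_normal((64, 8, 8))   # 2-D temperature features

# Feature concatenation: stack along the channel axis
concat = np.concatenate([f_image, f_temp], axis=0)
# Feature overlay: element-wise sum (assumed meaning of "overlay")
overlay = f_image + f_temp
# Feature maximum coupling: element-wise maximum
max_c = np.maximum(f_image, f_temp)
# Feature mean coupling: element-wise mean
mean_c = (f_image + f_temp) / 2.0

print(concat.shape, overlay.shape)  # (128, 8, 8) (64, 8, 8)
```

Note that concatenation doubles the channel count seen by the downstream decision-making layers, while the three coupling operations preserve the original feature shape.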