Noninvasive methods for condition assessment and quality control during construction are critically needed. The study reported here investigated the use of the infrared thermography (IRT) technique to estimate the total porosity and compressive strength of cement mortar. To assess the cooling characteristics of specimens and the porosity detection limit of IRT, mortar mixtures with a constant aggregate volume fraction and water-to-cement (w/c) ratios of 0.25, 0.35, 0.45, 0.55, and 0.65 were heated and subjected to four different cooling durations. The relationships between specimen cooling characteristics, measured porosity, and compressive strength were then evaluated. The compressive strength prediction efficiency of the proposed IRT model was also compared to that of commonly used ultrasonic pulse velocity (UPV)-strength prediction models. Results indicate that differences in the residual temperature of specimens during cooling became more pronounced with decreasing w/c ratio. Although a generally good fit with Newton's cooling function was obtained for all the cooling durations considered, the 60 min duration performed best, yielding a strong exponential relationship between the thermal time constant and total porosity (R² value of 0.99). Furthermore, with R² values of 0.97-0.98 and standard errors (SE) of 4.0-4.5 MPa, the compressive strength prediction accuracy of the IRT-strength models was superior to that of the UPV-strength models, which recorded R² values of 0.89-0.94 and SE of 6.2-9.0 MPa for the same set of mixtures. These results suggest that, besides serving as a quality control tool during new construction, the proposed IRT technique also has inherent potential for identifying delamination and selecting appropriate areas for core extraction during condition assessment of concrete structures.
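
As a minimal sketch of the curve-fitting workflow the abstract describes, the snippet below fits Newton's cooling law, T(t) = T_inf + (T_0 - T_inf)·exp(-t/τ), to IRT surface-temperature data to extract the thermal time constant τ, and then defines an exponential τ-porosity relation of the general form reported. The function names, starting guesses, and all numeric values are illustrative assumptions, not data or calibration constants from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def newton_cooling(t, T_inf, dT0, tau):
    """Newton's cooling law: T(t) = T_inf + dT0 * exp(-t / tau)."""
    return T_inf + dT0 * np.exp(-t / tau)

# Hypothetical surface temperatures (deg C) over a 60 min cooling window,
# synthesized here with tau = 18 min plus small measurement noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 60, 25)                                  # minutes
T = newton_cooling(t, 25.0, 30.0, 18.0) + rng.normal(0, 0.2, t.size)

# Fit the cooling curve; p0 holds rough initial guesses for (T_inf, dT0, tau).
(T_inf, dT0, tau), _ = curve_fit(newton_cooling, t, T, p0=(20.0, 25.0, 10.0))
print(f"fitted thermal time constant: tau = {tau:.1f} min")

# Illustrative exponential time-constant/porosity relation; the coefficients
# a and b would come from regression against measured porosity data.
def porosity_from_tau(tau, a, b):
    return a * np.exp(b * tau)
```

The recovered time constant would then feed the porosity and strength regressions in the same way a UPV reading feeds conventional UPV-strength models.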