The paper considers the following question: using principal component regression (PCR) or partial least squares regression (PLSR), how much data can be removed from X while retaining the original ability to predict Y? Two model reduction methods using similarity transformations are discussed, one giving projections of the original loadings onto the column space of the fitted response matrix Ŷ (essentially the orthogonal signal correction (OSC) methods), and one giving projections of the original scores onto the column space of the coefficient matrix B̂ (essentially the net analyte signal (NAS) methods). The loading projection method gives model residuals that are orthogonal to Y and Ŷ, which is valuable in certain applications. The score projection method, on the other hand, gives model residuals that are orthogonal to B̂, which is essential in other applications. It is shown that the reduced matrix X_{-Y}(S) from the score projection method is a subset of the reduced matrix X_{-Y}(L) from the loading projection method, and that it therefore has the smallest Frobenius norm, and thus the smallest total column variance, assuming centered data. Copyright © 2007 John Wiley & Sons, Ltd.
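As a rough numerical illustration only (not the paper's similarity-transformation construction), the sketch below builds a small PCR model with numpy and forms two reduced matrices by direct orthogonal projection: one keeping the part of X lying in the column space of Ŷ (a stand-in for the loading projection / OSC-like reduction) and one keeping the part of X lying in the column space of B̂ (a stand-in for the score projection / NAS-like reduction). All names (X_L, X_S, col_proj) and the simulated data are hypothetical; the point is to make the orthogonality, prediction-preservation, and Frobenius-norm claims concrete under these simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated centered data: n samples, p variables, m responses (hypothetical sizes)
n, p, m, a = 50, 10, 2, 3                     # a = number of PCR components
X = rng.standard_normal((n, p))
Y = X @ rng.standard_normal((p, m)) + 0.1 * rng.standard_normal((n, m))
X -= X.mean(axis=0)
Y -= Y.mean(axis=0)

# PCR: regress Y on the first a principal-component scores of X
U, s, Vt = np.linalg.svd(X, full_matrices=False)
V_a = Vt[:a].T                                # loadings (p x a)
T_a = X @ V_a                                 # scores   (n x a)
B_hat = V_a @ np.linalg.lstsq(T_a, Y, rcond=None)[0]   # coefficient matrix (p x m)
Y_hat = X @ B_hat                             # fitted responses (n x m)

def col_proj(A):
    """Orthogonal projector onto the column space of A."""
    return A @ np.linalg.pinv(A)

# Y_hat-side reduction: keep the part of X lying in col(Y_hat) (sample space)
X_L = col_proj(Y_hat) @ X
# B_hat-side reduction: keep the part of X lying in col(B_hat) (variable space)
X_S = X @ col_proj(B_hat)

# Residual of X_L is orthogonal to Y_hat; residual of X_S is orthogonal to B_hat
print(np.allclose(Y_hat.T @ (X - X_L), 0))    # True
print(np.allclose((X - X_S) @ B_hat, 0))      # True

# Both reduced matrices reproduce the original fitted responses
print(np.allclose(X_L @ B_hat, Y_hat), np.allclose(X_S @ B_hat, Y_hat))  # True True

# The B_hat-side reduction has the smallest Frobenius norm
print(np.linalg.norm(X_S, "fro") <= np.linalg.norm(X_L, "fro") <= np.linalg.norm(X, "fro"))  # True
```

Under these simplified projections, X_S equals X_L further projected onto col(B̂), which is why its Frobenius norm, and hence its total column variance for centered data, cannot exceed that of X_L; this mirrors the ordering stated in the abstract, though the paper's reduced matrices are obtained via similarity transformations rather than the direct projectors used here.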