Using explicit formulas for the information matrix of ML factor analysis under multivariate normal theory, we define the gross and net information gained, for estimating the parameters of a covariance structure, by adding the associated mean structure. It is proved that a necessary and sufficient condition for a non-null net information gain is that the identified mean structure is not saturated, provided that the factor mean is not a zero vector. Under this necessary and sufficient condition on the mean structure, the asymptotic variances of the ML estimates for some covariance structure parameters are reduced. In light of this theorem, we discuss some recently proposed mean and covariance structure models. For some of these models, adding the associated mean structure will not improve the estimation of the parameters of interest; we argue that in such cases it may be better not to model the means. In other cases, where the identified mean structure is not saturated, modeling the means should be seriously considered. Using numerical examples, we illustrate that the reduction of asymptotic variances for factor loadings can be quite substantial when a mean structure is added. We also demonstrate similar numerical results for a two-group model. Limitations and applicability of our results are discussed.
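For orientation, a minimal sketch of the kind of mean and covariance structure referred to above, assuming the standard confirmatory factor-analytic parameterization (our own illustration; the specific notation is not taken from the paper): with observed vector $x$, factors $\xi$, and unique errors $\varepsilon$,
\[
x = \nu + \Lambda \xi + \varepsilon, \qquad
E(\xi) = \tau, \quad \operatorname{Cov}(\xi) = \Phi, \quad \operatorname{Cov}(\varepsilon) = \Psi,
\]
so that the model-implied moments are
\[
\mu(\theta) = \nu + \Lambda \tau, \qquad
\Sigma(\theta) = \Lambda \Phi \Lambda^{\top} + \Psi .
\]
In this notation, the mean structure $\mu(\theta)$ is saturated when it places no restriction on $E(x)$ (for example, $\nu$ fully free with $\tau = 0$); the theorem above concerns the case where $\mu(\theta)$ is not saturated and the factor mean $\tau$ is nonzero.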