Multi-view unsupervised feature selection has achieved great success in identifying a subset of prominent features from multi-view data to produce compact and meaningful representations. However, most existing methods assume that all data views are complete, which is often not the case in real-world scenarios: multi-view data are frequently incomplete, with some instances missing in certain views. To address this issue, we propose an incomplete multi-view unsupervised feature selection model based on multiple space learning, termed Learning Missing Instances in Intact and Projection Spaces for Incomplete Multi-view Unsupervised Feature Selection (LIPS). The model integrates intact latent space learning, projection space learning, missing instance imputation, and correlation structure learning into a joint framework. Specifically, LIPS employs intact latent space learning to generate intact representations that capture the full information of the multi-view data. Using these representations, LIPS computes correlations among data points through a constrained self-expression strategy, producing a sparse correlation matrix in which each row contains only a few non-zero entries, signifying that each data point can be linearly reconstructed from a small subset of related neighbors. Subsequently, LIPS projects the data into low-dimensional spaces that preserve these neighborhood correlations. Finally, it leverages complementary information to impute the missing instances from a cross-view perspective based on the intact representations, and utilizes neighborhood information to generate neighborhood-smooth imputations for missing instances from view-specific perspectives. Additionally, an effective algorithm is developed to solve the resulting optimization problem. Extensive experiments on six public datasets of different types, including image datasets (MSRC-v1, Caltech101-7, and CIFAR-10), text datasets (BBCSport and WebKB), and a face dataset (Yale), measured by accuracy (Acc) and normalized mutual information (NMI), demonstrate that the proposed LIPS outperforms state-of-the-art methods.
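
To make the sparse self-expression idea above concrete, the following is a minimal, illustrative Python sketch, not the paper's actual formulation or solver. It assumes the intact representations are stacked in a matrix H (rows are data points) and realizes the "few non-zero entries per row, zero self-reconstruction" constraint with an l1 penalty and ISTA-style proximal gradient updates; the function name, the sparsity weight lam, and the iteration count are hypothetical choices introduced only for this example.

```python
import numpy as np

def self_expression_sparse(H, lam=0.1, step=None, n_iter=500):
    """Illustrative sparse self-expression: find S with H ~= S @ H,
    a zero diagonal, and few non-zeros per row (l1 penalty), via ISTA.

    H   : (n, k) intact representations (rows = data points)
    lam : sparsity weight (hypothetical, not a parameter from the paper)
    """
    n = H.shape[0]
    G = H @ H.T                        # Gram matrix of the representations
    if step is None:
        # step size = 1 / Lipschitz constant of the smooth term's gradient
        step = 1.0 / (np.linalg.norm(G, 2) + 1e-12)
    S = np.zeros((n, n))
    for _ in range(n_iter):
        grad = (S @ H - H) @ H.T       # gradient of 0.5 * ||H - S H||_F^2
        S = S - step * grad
        # soft-thresholding keeps only a few non-zero entries per row
        S = np.sign(S) * np.maximum(np.abs(S) - step * lam, 0.0)
        np.fill_diagonal(S, 0.0)       # a point should not reconstruct itself
    return S

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    H = rng.standard_normal((50, 8))   # toy stand-in for intact representations
    S = self_expression_sparse(H, lam=0.2)
    print("avg non-zeros per row:", (np.abs(S) > 1e-6).sum(axis=1).mean())
```

Larger values of lam drive more entries of each row of S to zero, so each point is reconstructed from fewer neighbors; this is one common way to obtain the kind of sparse correlation matrix the abstract describes, under the stated assumptions.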