In multi-label learning, training data are typically large-scale and contain many noisy and redundant instances. Inducing a classifier directly from the raw data can therefore incur high memory overhead and degrade classification performance. Prototype selection, which reduces the number of training instances, is an effective way to alleviate these problems. However, most existing multi-label prototype selection algorithms first transform the multi-label data set into a single-label one via problem transformation methods, which ignores label correlations and can lead to suboptimal prototypes. To overcome this limitation, we propose a new method called CO-GCNN, i.e., multi-label prototype selection with label Co-Occurrence and the Generalized Condensed Nearest Neighbor rule. CO-GCNN captures label correlation by computing the co-occurrence rate of pairwise labels and uses it to partition the original data into positive and negative classes. Prototype selection is then performed with the generalized condensed nearest neighbor rule to obtain a reduced instance set. Experiments on six multi-label benchmark datasets show that classifiers trained on the reduced set outperform those trained on the original data, confirming the effectiveness of the proposed method.
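The two building blocks named above, pairwise label co-occurrence and generalized condensed nearest neighbor (GCNN) condensation, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the Jaccard-style co-occurrence rate, the absorption margin parameter rho, and all function names are assumptions made for the example.

```python
import numpy as np

def co_occurrence_rate(Y):
    """Pairwise label co-occurrence for a binary label matrix Y (n_samples, n_labels).

    Here C[j, k] = |samples with both labels| / |samples with either label|,
    a Jaccard-style rate (an assumed definition for illustration).
    """
    n_labels = Y.shape[1]
    C = np.zeros((n_labels, n_labels))
    for j in range(n_labels):
        for k in range(n_labels):
            both = np.sum((Y[:, j] == 1) & (Y[:, k] == 1))
            either = np.sum((Y[:, j] == 1) | (Y[:, k] == 1))
            C[j, k] = both / either if either else 0.0
    return C

def gcnn_select(X, y, rho=0.5):
    """Simplified GCNN condensation on a two-class (positive/negative) split.

    Seed one prototype per class; a sample is 'absorbed' only if its nearest
    same-class prototype beats its nearest other-class prototype by a margin
    rho * delta_n, where delta_n is the minimum distance between samples of
    different classes. Unabsorbed samples become prototypes; repeat to
    convergence. Returns the indices of the selected prototypes.
    """
    classes = np.unique(y)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    delta_n = D[y[:, None] != y[None, :]].min()
    proto_idx = [np.where(y == c)[0][0] for c in classes]  # one seed per class
    changed = True
    while changed:
        changed = False
        P = np.array(proto_idx)
        for i in range(len(X)):
            if i in proto_idx:
                continue
            d = D[i, P]
            same = y[P] == y[i]
            d_same = d[same].min()
            d_diff = d[~same].min() if (~same).any() else np.inf
            # Absorption criterion: same-class prototype wins by margin rho*delta_n.
            if not (d_diff - d_same > rho * delta_n):
                proto_idx.append(i)  # not absorbed -> promote to prototype
                changed = True
    return np.array(sorted(proto_idx))
```

With rho = 0, this reduces to the classical condensed nearest neighbor rule; larger rho forces a wider margin and keeps more prototypes, trading reduction rate for a safer decision boundary.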