Multi-label learning (MLL) trains a classification model on data in which each training instance is annotated with a set of class labels simultaneously. Following the binary relevance paradigm of MLL, a recently effective strategy is to construct specific features for each label instead of training over the original feature space. Existing label-specific methods, however, consider only the information in instance distributions, which leaves the reconstructed features poorly discriminative. In this paper, we propose generating Label-spEcific feaTures by simultaneously exploring insTance distributions and fEatuRe distributions, and name the resulting method Letter. Letter reconstructs two subsets of new features, one at the instance level and one at the feature level. More concretely, at the instance level, Letter incorporates a sparse constraint; at the feature level, it clusters the original features to construct new features as an extension. The union of these two subsets forms the final set of label-specific features. Extensive experiments on a total of 14 benchmark datasets verify the competitive performance of Letter against existing state-of-the-art MLL methods.
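To make the two-level construction concrete, the sketch below builds label-specific features for a single label from both levels and concatenates them. This is an illustrative simplification, not the paper's actual optimization: at the instance level it uses distances to class-wise cluster centers (a LIFT-style construction) in place of the sparse-constrained reconstruction, and at the feature level it clusters the original feature columns and averages each cluster. All function names and the choice of k-means are assumptions for illustration.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Tiny Lloyd's-algorithm k-means; enough for a sketch."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # squared distances: (n, k)
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        assign = d.argmin(1)
        for j in range(k):
            pts = X[assign == j]
            if len(pts):
                centers[j] = pts.mean(0)
    return centers

def label_specific_features(X, y, k=3):
    """Two feature subsets for one label; y is the binary vector for that label."""
    # Instance level (simplified stand-in for the sparse-constrained step):
    # distances from every instance to cluster centers of the positive
    # and negative instances of this label.
    pos, neg = X[y == 1], X[y == 0]
    centers = np.vstack([kmeans(pos, min(k, len(pos))),
                         kmeans(neg, min(k, len(neg)))])
    inst_feats = np.sqrt(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1))
    # Feature level: cluster the original feature columns (rows of X.T)
    # and summarize each cluster by its per-instance mean.
    centers_f = kmeans(X.T, k)
    d = ((X.T[:, None, :] - centers_f[None, :, :]) ** 2).sum(-1)
    col_assign = d.argmin(1)
    feat_feats = np.column_stack(
        [X[:, col_assign == j].mean(1) if (col_assign == j).any()
         else np.zeros(len(X)) for j in range(k)])
    # The concatenation is the label-specific representation.
    return np.hstack([inst_feats, feat_feats])
```

In this toy form, an instance ends up described by how it sits relative to the label's positive/negative structure (instance level) plus a compact summary of correlated feature groups (feature level), mirroring the combination described above.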