Hierarchical text classification is an essential task in natural language processing. Existing studies focus only on the label hierarchy structure, for example by building a classifier for each level of labels or exploiting the label taxonomy to improve classification performance. However, these methods ignore dataset imbalance, which poses a serious challenge to text classification performance, especially for tail categories. To this end, we propose the Hierarchy-aware Bilateral-Branch Network (HiBBN) to address this problem: we introduce a bilateral-branch network and apply a hierarchy-aware encoder to model text representations together with label dependencies. In addition, HiBBN's two network branches cooperate with a uniform sampler and a reversed sampler, respectively, which allows the model to handle the data imbalance problem effectively. Our model thus captures hierarchical structural information and models tail data simultaneously, and extensive experiments on benchmark datasets show that it achieves better performance, especially on fine-grained categories.
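The abstract describes two branches fed by a uniform sampler and a reversed sampler whose outputs are combined. The sketch below illustrates this general bilateral-branch idea under stated assumptions: it is not the authors' released implementation, and the PyTorch-based sampler, the mixing function, and the `alpha_schedule` decay are illustrative choices borrowed from common bilateral-branch setups.

```python
# Illustrative sketch of the bilateral-branch idea with a uniform and a
# reversed sampler (hypothetical helpers; not the paper's official code).
import torch
from torch.utils.data import WeightedRandomSampler


def reversed_sampler(labels, num_samples):
    """Sample examples with probability inversely proportional to their
    class frequency, so tail categories are drawn more often."""
    labels = torch.as_tensor(labels)
    class_counts = torch.bincount(labels).float()
    # Inverse-frequency weight assigned to each example via its class.
    weights = 1.0 / class_counts[labels]
    return WeightedRandomSampler(weights, num_samples, replacement=True)


def bilateral_logits(logits_uniform, logits_reversed, alpha):
    """Mix the two branch outputs: alpha near 1 emphasizes the conventional
    (uniform) branch, alpha near 0 emphasizes the re-balancing (reversed)
    branch that favors tail categories."""
    return alpha * logits_uniform + (1.0 - alpha) * logits_reversed


def alpha_schedule(epoch, total_epochs):
    """One common cumulative-learning schedule (assumption): shift focus
    from the uniform branch to the reversed branch as training proceeds."""
    return 1.0 - (epoch / total_epochs) ** 2
```

In this sketch the uniform branch sees the natural label distribution while the reversed branch over-samples rare classes, and the schedule gradually moves the training signal toward the tail-focused branch.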