In knowledge graphs (KGs), logic rules offer interpretable explanations for predictions and are essential for reasoning on downstream tasks such as question answering. However, a key challenge remains unresolved: how to effectively encode and exploit the structural features around the head entity to generate the most applicable rules. This paper proposes a structure-aware graph transformer for rule learning, named Structure-Aware Rule Learning (SARL), which leverages both local and global structural information of the subgraph around the head entity to generate the most suitable rule path. SARL employs a generalized attention mechanism combined with replaceable feature extractors to aggregate local structural information of entities. It then incorporates global structural and relational information to further model the subgraph structure. Finally, a rule decoder uses the resulting subgraph representation to generate the most appropriate rules. Comprehensive experiments on four real-world knowledge graph datasets show that SARL significantly improves link prediction on large-scale KGs and surpasses existing methods, with Hits@1 improvements of 6.5% on UMLS and 4.5% on FB15K-237.
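To make the three-stage pipeline named above concrete, the following is a minimal PyTorch sketch of (1) attention-based aggregation of each entity's local neighbourhood, (2) a global encoder over the subgraph around the head entity, and (3) a decoder that emits a rule path as a sequence of relations. All layer choices, dimensions, and the greedy decoding loop are illustrative assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class SARLSketch(nn.Module):
    """Illustrative sketch of a SARL-style pipeline:
    (1) local attention over sampled neighbours,
    (2) global subgraph encoding,
    (3) rule-path decoding (one relation per step).
    Hyperparameters and layers are assumptions for illustration only."""

    def __init__(self, num_entities, num_relations, dim=64, max_rule_len=3):
        super().__init__()
        self.entity_emb = nn.Embedding(num_entities, dim)
        self.relation_emb = nn.Embedding(num_relations + 1, dim)  # +1: start token
        # (1) local aggregation: attention over each entity's neighbours
        self.local_attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
        # (2) global encoder: a standard transformer encoder stands in for the
        # paper's structure-aware graph transformer
        self.global_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(dim, nhead=4, batch_first=True), num_layers=2)
        # (3) rule decoder: GRU predicting one relation per step of the rule path
        self.decoder = nn.GRU(dim, dim, batch_first=True)
        self.rel_out = nn.Linear(dim, num_relations)
        self.max_rule_len = max_rule_len

    def forward(self, subgraph_entities, neighbor_entities):
        # subgraph_entities: (B, N) entity ids around the head entity
        # neighbor_entities: (B, N, K) sampled neighbour ids for each entity
        node = self.entity_emb(subgraph_entities)      # (B, N, d)
        neigh = self.entity_emb(neighbor_entities)     # (B, N, K, d)
        B, N, K, d = neigh.shape
        # (1) each node attends over its own local neighbourhood
        local, _ = self.local_attn(node.reshape(B * N, 1, d),
                                   neigh.reshape(B * N, K, d),
                                   neigh.reshape(B * N, K, d))
        local = local.reshape(B, N, d)
        # (2) global encoding of the whole subgraph, pooled into a summary
        ctx = self.global_encoder(local)               # (B, N, d)
        summary = ctx.mean(dim=1, keepdim=True)        # (B, 1, d)
        # (3) greedily decode a fixed-length rule path from the summary
        inp = self.relation_emb.weight[-1].expand(B, 1, -1)   # start token
        h = summary.transpose(0, 1).contiguous()              # (1, B, d)
        logits = []
        for _ in range(self.max_rule_len):
            out, h = self.decoder(inp, h)
            step_logits = self.rel_out(out)            # (B, 1, num_relations)
            logits.append(step_logits)
            inp = self.relation_emb(step_logits.argmax(-1))   # next-step input
        return torch.cat(logits, dim=1)                # (B, max_rule_len, num_relations)
```

The decoded relation sequence corresponds to a candidate rule body; scoring and training objectives (e.g., against observed triples) are omitted here.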