Multitemporal polarimetric SAR (PolSAR) data provide unique insight into the temporal scattering characteristics of targets and highlight their dynamic changes over time, thereby supporting improved classification performance. However, constrained by the complexity of satellite orbit control and the difficulty of acquiring time-series PolSAR data, most existing methods rely on a single PolSAR image for land cover classification, which inherently limits their ability to generalize across diverse scenarios. To address this limitation, this work introduces a Hybrid Attention-GRU Transformer (HAG-Former) model that performs pixel-level temporal-polarimetric change analysis and captures the dynamic variations in polarization scattering properties, thereby enhancing classification robustness and versatility. The approach integrates a self-attention mechanism, a Gated Recurrent Unit (GRU), and a transformer encoder to analyze pixel-level changes in polarimetric features. First, the self-attention mechanism identifies and reweights the features most useful for classification. The weighted features are then fed into the GRU, which models local temporal-polarimetric relationships. These relationships, together with the salient features selected by the self-attention mechanism, are processed by the transformer encoder to capture global information. In addition, a label smoothing loss function is employed during training to mitigate the impact of sample imbalance on classification accuracy. The proposed method is evaluated on two benchmark datasets, where it improves overall accuracy by 2.21% and 1.79%, respectively, over state-of-the-art methods. The code is available at https://github.com/Thomasakun/HAGFormer.
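The label smoothing loss mentioned above can be sketched as follows. This is a minimal NumPy illustration of the standard label-smoothing cross-entropy (the paper's exact formulation and smoothing factor are not stated here, so the `eps=0.1` default and function name are assumptions, not the authors' implementation):

```python
import numpy as np

def label_smoothing_ce(logits, labels, eps=0.1):
    """Cross-entropy with label smoothing (illustrative sketch).

    Instead of a one-hot target, each of the K classes receives eps/K
    probability mass and the true class (1 - eps) + eps/K. Softer targets
    discourage over-confident predictions on majority classes, which is
    how label smoothing mitigates sample imbalance during training.
    """
    n, k = logits.shape
    # Numerically stable log-softmax over the class axis.
    z = logits - logits.max(axis=1, keepdims=True)
    log_p = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    # Smoothed target distribution: uniform eps/K plus (1 - eps) on the label.
    q = np.full((n, k), eps / k)
    q[np.arange(n), labels] += 1.0 - eps
    # Mean cross-entropy between smoothed targets and predictions.
    return float(-(q * log_p).sum(axis=1).mean())
```

With `eps=0.0` this reduces to the ordinary cross-entropy; increasing `eps` penalizes confident predictions more heavily, trading a small amount of training accuracy for better-calibrated class probabilities.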