Temporal knowledge graph completion (TKGC) aims to infer missing facts in temporal knowledge graphs (TKGs) and has been widely studied. Although previous models based on graph neural networks (GNNs) have achieved noteworthy results, they tend to rely on complex modules to learn contextual representations, which require large numbers of parameters and incur heavy memory consumption. In addition, existing TKGC approaches rely on static feature representations of entities and relations, which fail to capture the semantic information of contexts effectively. In this paper, we propose a global and local information-aware relational graph convolutional neural network (GLARGCN) model to address these issues. First, we design a sampler that captures significant neighbors by combining global historical event frequencies with local temporal relative displacements, and that introduces no additional learnable parameters. We then employ a time-aware encoder to model timestamps, relations, and entities uniformly, and perform a graph convolution operation to learn a global graph representation. Finally, our method predicts missing entities using a scoring function. We evaluate the model on four benchmark datasets and on an additional dataset with unseen timestamps. The experimental results demonstrate that GLARGCN not only outperforms contemporary models but also remains robust in scenarios with unseen timestamps.
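To make the sampler idea concrete, the following is a minimal sketch of a parameter-free neighbor sampler that mixes a global frequency signal with a local temporal-displacement signal, as the abstract describes. All names and details here are illustrative assumptions (the quadruple format, the normalization, the mixing weight alpha, and the function sample_neighbors), not the authors' actual implementation:

```python
from collections import Counter

def sample_neighbors(events, query_entity, query_time, k=10, alpha=0.5):
    """Rank candidate neighbor events of `query_entity` by a score that
    combines a global term (historical frequency of the event's
    (relation, neighbor) pair) with a local term (temporal proximity of
    the event to the query time), then keep the top k.

    `events` is assumed to be a list of (subject, relation, object,
    timestamp) quadruples; no learnable parameters are involved.
    """
    # Global historical frequency of each (relation, neighbor) pair.
    freq = Counter((r, o) for (_, r, o, _) in events)

    # Candidate neighbors: events whose subject is the query entity.
    candidates = [(s, r, o, t) for (s, r, o, t) in events if s == query_entity]

    def score(event):
        _, r, o, t = event
        global_term = freq[(r, o)] / len(events)        # normalized frequency
        local_term = 1.0 / (1.0 + abs(query_time - t))  # decays with displacement
        return alpha * global_term + (1.0 - alpha) * local_term

    return sorted(candidates, key=score, reverse=True)[:k]
```

One design point this sketch illustrates: because both terms are computed directly from the data, the sampler adds no parameters or memory beyond the event history itself, in contrast to learned attention-based neighbor selection.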