Real-time and accurate information on fine-grained changes in crop cultivation is of great significance for crop growth monitoring, yield prediction, and agricultural structure adjustment. Existing semantic change detection (SCD) algorithms suffer from severe spectral confusion in multi-temporal visible high-resolution unmanned aerial vehicle (UAV) images, interference from large and complex backgrounds, and "salt-and-pepper" noise. To effectively extract deep image features of crops and meet the demands of practical agricultural engineering applications, this paper designs and proposes AGSPNet, an SCD framework for crops constrained by agricultural geographic scenes and parcel scale. AGSPNet comprises three parts: an agricultural geographic scene (AGS) division module, a parcel edge extraction module, and a crop SCD module. (1) The AGS division module uses multi-source open geographic data products to delineate AGSs with relatively consistent geographic element conditions by analyzing the patterns of agricultural territorial differentiation. (2) The parcel edge extraction module uses a bi-directional cascade network (BDCN) together with a designed edge optimization model to obtain complete farm parcels within each AGS. (3) The SCD module uses a criss-cross attention network (CCNet) with a pseudo-Siamese structure and a change feature discrimination module to extract semantic features and change features of AGS crops and output accurate pixel-level semantic change maps; the parcel extraction results are then fused with these maps to obtain parcel-scale fine-grained crop SCD results. In addition, we produce and introduce a UAV image SCD dataset (CSCD) dedicated to agricultural monitoring, encompassing multiple semantic change types of crops in complex geographical scenes.
We conduct comparative experiments and accuracy evaluations on two test areas of this dataset. The results show that the crop SCD results of AGSPNet consistently outperform those of other deep learning SCD models both quantitatively and qualitatively, with the evaluation metrics F1-score, kappa, OA, and mIoU improving by 0.038, 0.021, 0.011, and 0.062, respectively, on average over the sub-optimal method. The proposed method can clearly detect fine-grained changes in crop types in complex scenes, providing scientific and technical support for smart agriculture monitoring and management, food policy formulation, and food security assurance.