Dynamic or sudden changes in various scenes may give rise to new object categories, and with only limited annotated samples for these new objects, deep learning models are susceptible to overfitting. While few-shot object detection (FSOD) is effective with limited samples, current FSOD methods for remote sensing images still face specific challenges. The "pretraining-transfer" paradigm tends to forget the feature representations of base classes, impairing the learning of novel classes during few-shot training. Furthermore, sparsely labeled remote sensing images contain implicit objects (object instances left unannotated), which introduce erroneous supervisory information. To address these challenges, we propose an FSOD method that incorporates multiscale feature knowledge distillation and implicit object discovery, named MFKDIOD, which preserves base-class performance and mitigates the impact of implicit objects. Specifically, we first design a multiscale feature knowledge distillation (MFKD) module that transfers base-class knowledge from a teacher network to a student network, enabling the student to better retain the base-class feature representations. Second, we design an implicit object discovery (IOD) module that utilizes both the teacher and student networks to discover implicit objects within the few-shot training data and generate pseudolabels. The code will be available at https://github.com/RS-CSU/MFKDIOD.
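To make the MFKD idea concrete, the sketch below illustrates one plausible form of a multiscale feature distillation objective: a per-scale mean-squared error between the frozen teacher's feature maps and the student's, summed over pyramid levels. The function name, the use of MSE, and the list-of-feature-maps interface are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def multiscale_distillation_loss(teacher_feats, student_feats):
    """Sum of per-scale MSE between teacher and student feature maps.

    teacher_feats / student_feats: lists of arrays, one per feature-pyramid
    level, each shaped (C, H, W). Matching the teacher's base-class features
    at every scale is one common way to retain base-class representations;
    the actual MFKD loss in the paper may differ.
    """
    return sum(
        float(np.mean((t - s) ** 2))          # per-scale MSE
        for t, s in zip(teacher_feats, student_feats)
    )
```

When the student exactly reproduces the teacher's features at every scale, the loss is zero; any drift away from the base-class representations is penalized at all scales simultaneously.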