Knowledge distillation is a method that transfers information from a larger network (i.e., the teacher) to a smaller network (i.e., the student), so that the student can inherit the strong performance of the teacher while keeping its computational cost low. Knowledge distillation has been widely applied in object detection to mitigate the rapid growth of model size. In this paper, we propose an object detector based on knowledge distillation. However, directly mimicking the features of the teacher often fails to achieve the desired results because of the extra noise in the features extracted by the student, which causes significant teacher-student inconsistency and may even weaken the capability of the student. To address this issue, we utilize a diffusion model to remove this noise, narrowing the gap between the features extracted by the teacher and the student and thereby improving the performance of the student. Furthermore, we develop a noise matching module that matches the noise level in the student features during the denoising process. Extensive experiments on COCO and Pascal VOC validate the effectiveness of the proposed method: it achieves 40.0% mAP and 81.63% mAP respectively, while maintaining a frame rate of 27.3 FPS, demonstrating the superiority of our model in both accuracy and speed.
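To make the idea concrete, the following PyTorch sketch illustrates one plausible reading of the pipeline described above: the student feature map is treated as a noisy version of the teacher's, a noise matching module estimates a discrete noise level from the student features, and a lightweight denoiser iteratively removes the predicted noise before the distillation loss is computed against the teacher features. This is a minimal sketch under our own assumptions, not the authors' implementation; all module names, shapes, and the simple iterative denoising loop are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class FeatureDenoiser(nn.Module):
    """Hypothetical lightweight denoiser: predicts the noise present in the
    student feature map, conditioned on a discrete noise-level embedding."""

    def __init__(self, channels, num_levels=10):
        super().__init__()
        self.level_emb = nn.Embedding(num_levels, channels)
        self.net = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, feat, level):
        # Inject the noise-level embedding channel-wise, then predict noise.
        emb = self.level_emb(level)[:, :, None, None]  # (B, C, 1, 1)
        return self.net(feat + emb)


class NoiseMatcher(nn.Module):
    """Hypothetical noise matching module: infers a discrete noise level
    for the student features from their global statistics."""

    def __init__(self, channels, num_levels=10):
        super().__init__()
        self.fc = nn.Linear(channels, num_levels)

    def forward(self, feat):
        pooled = feat.mean(dim=(2, 3))           # (B, C) global average pool
        return self.fc(pooled).argmax(dim=1)     # (B,) estimated noise level


def distillation_loss(student_feat, teacher_feat, denoiser, matcher, steps=3):
    """Denoise the student features, then mimic the (frozen) teacher."""
    level = matcher(student_feat)                # match the noise level first
    x = student_feat
    for _ in range(steps):                       # iterative denoising steps
        x = x - denoiser(x, level)               # subtract predicted noise
    return F.mse_loss(x, teacher_feat.detach())
```

In such a setup, this loss would simply be added to the standard detection losses during training, and the denoiser and matcher would be discarded at inference, which is consistent with the abstract's claim that the student detector's frame rate is preserved.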