Neural architecture search (NAS) automates the design of neural networks for specific tasks. Recently, zero-shot NAS has attracted much attention. Unlike traditional NAS, which relies on training to rank architectures, zero-shot NAS uses gradient or activation information to estimate architecture performance. However, existing zero-shot NAS methods are limited by inconsistent architecture rankings and by the evaluation bias of their search algorithms, making it difficult to discover high-accuracy networks efficiently. To address these issues, this paper proposes an efficient and stable search framework for zero-shot NAS. First, we design a stable zero-shot proxy that achieves strong consistency with network accuracy by utilizing filtered gradient information. On this basis, we employ a multi-fidelity evolutionary algorithm for efficient exploration. This algorithm uses multi-fidelity proxies to correct the bias towards certain types of networks and to better distinguish high-performing architectures, achieving rapid convergence through performance-directed multi-point crossover and mutation. Experimental results on NATS-Bench demonstrate that our framework discovers high-performance architectures within minutes of GPU time, outperforming existing training-free and training-based NAS methods. The code is available at https://github.com/mine7777/MFNAS.
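To make the overall search pattern concrete, the following is a minimal, self-contained sketch of a multi-fidelity evolutionary loop in the spirit described above. Everything here is illustrative: the two proxy functions (`cheap_proxy`, `refined_proxy`) are toy stand-ins for the paper's gradient-based zero-shot proxies, and the architecture encoding (a list of operation indices) and all hyperparameters are assumptions, not the authors' implementation.

```python
import random

# Toy stand-ins for zero-shot proxies. In the actual framework these would
# score an untrained network from (filtered) gradient information; here they
# are simple functions so the search loop itself is runnable.
def cheap_proxy(arch):
    # low-fidelity score: fast but coarse ranking
    return sum(arch)

def refined_proxy(arch):
    # high-fidelity score: more expensive, assumed more consistent
    return sum(g * (i + 1) for i, g in enumerate(arch))

def crossover(rng, p1, p2):
    # multi-point crossover: each gene is taken from either parent
    return [g1 if rng.random() < 0.5 else g2 for g1, g2 in zip(p1, p2)]

def mutate(rng, arch, n_ops, rate=0.2):
    # randomly resample a fraction of the genes
    return [rng.randrange(n_ops) if rng.random() < rate else g for g in arch]

def search(pop_size=20, arch_len=6, n_ops=5, generations=10, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randrange(n_ops) for _ in range(arch_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # low-fidelity filter: keep the top half by the cheap proxy
        pop.sort(key=cheap_proxy, reverse=True)
        survivors = pop[: pop_size // 2]
        # high-fidelity re-ranking of survivors, correcting the cheap
        # proxy's bias before selecting parents
        survivors.sort(key=refined_proxy, reverse=True)
        # refill the population from the best-ranked parents
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = rng.sample(survivors[: max(2, len(survivors) // 2)], 2)
            children.append(mutate(rng, crossover(rng, p1, p2), n_ops))
        pop = survivors + children
    return max(pop, key=refined_proxy)

best = search()
print(best)
```

The key structural point is that the cheap proxy only gates which candidates receive the more expensive evaluation, so the high-fidelity score (not the biased low-fidelity one) drives parent selection and the final choice.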