In this article, we present an optimization method based on the hybridization of the genetic algorithm (GA) and gradient optimization (grad-opt), facilitated by a physics-informed machine learning model. In the proposed method, the slow-but-global GA is used as a pre-screening tool to provide good initial values to the fast-but-local grad-opt. We introduce a robust metric to measure the goodness of candidate designs as starting points and use a set of control parameters to fine-tune the optimization dynamics. We utilize the machine learning with analytic extension of eigenvalues (ML w/AEE) model to integrate the two stages seamlessly and to accelerate the optimization by speeding up forward evaluation in the GA and gradient calculation in grad-opt. We employ a divide-and-conquer strategy to further improve modeling efficiency and accelerate the design process, and we propose a fusion module to allow end-to-end gradient propagation. Two numerical examples demonstrate the robustness and efficiency of the proposed method compared with traditional approaches.
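The two-stage hybridization described above can be illustrated with a minimal sketch: a simple GA pre-screens the design space and hands its best candidate to a gradient-based local optimizer as the starting point. The sketch below uses the Rastrigin test function as a stand-in objective and generic GA hyper-parameters; the paper's ML w/AEE surrogate, robust goodness metric, control parameters, divide-and-conquer modeling, and fusion module are not reproduced here, and all names and settings are illustrative assumptions only.

```python
"""Minimal sketch of the GA -> grad-opt hybrid (illustrative assumptions only)."""

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)


def rastrigin(x):
    # Multimodal test objective standing in for the actual design problem.
    x = np.asarray(x)
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))


def ga_prescreen(obj, dim, bounds, pop_size=40, generations=50,
                 mutation_sigma=0.3, elite_frac=0.2):
    """Slow-but-global GA: evolve a population and return the best candidate,
    which then serves as the initial value for the fast-but-local optimizer."""
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    n_elite = max(1, int(elite_frac * pop_size))
    for _ in range(generations):
        fitness = np.array([obj(ind) for ind in pop])
        order = np.argsort(fitness)                  # lower objective = fitter
        elites = pop[order[:n_elite]]
        children = []
        while len(children) < pop_size - n_elite:
            a, b = elites[rng.integers(n_elite, size=2)]
            w = rng.uniform(0.0, 1.0, size=dim)      # blend crossover
            child = w * a + (1.0 - w) * b
            child += rng.normal(0.0, mutation_sigma, size=dim)  # mutation
            children.append(np.clip(child, lo, hi))
        pop = np.vstack([elites, np.array(children)])
    fitness = np.array([obj(ind) for ind in pop])
    return pop[np.argmin(fitness)]


if __name__ == "__main__":
    dim, bounds = 5, (-5.12, 5.12)
    # Stage 1: GA pre-screening provides a good starting point.
    x0 = ga_prescreen(rastrigin, dim, bounds)
    # Stage 2: gradient-based refinement from that starting point.
    result = minimize(rastrigin, x0, method="L-BFGS-B", bounds=[bounds] * dim)
    print("GA starting point objective :", rastrigin(x0))
    print("After gradient refinement   :", result.fun)
```

In the proposed method, the expensive forward evaluations inside the GA loop and the gradients used in the refinement stage would both be supplied by the ML w/AEE surrogate rather than by the true objective as in this sketch.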