The neural network ensemble (NNE) is a widely used and effective classification method. To balance the accuracy and diversity of the base neural networks (NNs) in an NNE, and thereby achieve strong classification performance, the topological structures and weight parameters of all base NNs should be optimized simultaneously. The extreme learning machine (ELM) is a fast, non-iterative learning strategy that has attracted considerable attention for its strong generalization ability, but it applies only to NNs with fixed structures. To address these challenges, this paper proposes a novel ensemble learning algorithm based on multimodal differential evolution (MMDE) and the ELM, called MMDE-ELM-NNE. The proposed algorithm optimizes both the structures and the weight parameters of all base NNs in an NNE in a single run: a dedicated coding scheme lets MMDE optimize the structures and input parameters of the base NNs, while the ELM computes the output parameters analytically. In addition, the MMDE algorithm is enhanced to better handle the high-dimensional optimization required by NNE training, which further improves classification accuracy. Experiments on twenty multimodal optimization problems, with comparisons against eleven existing algorithms, verify the effectiveness of the proposed MMDE. On classification benchmark datasets, results analyzed with a Friedman test show that MMDE-ELM-NNE generalizes significantly better than ensembles built with MMDE or ELM alone, and that it is competitive with other state-of-the-art classification algorithms.
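For context, the ELM step referenced above computes a network's output weights in closed form rather than by iterative training. The sketch below illustrates this standard ELM computation in Python with NumPy; the function names, the sigmoid activation, and the division of responsibility (input weights evolved by MMDE, output weights solved via the Moore-Penrose pseudoinverse) are illustrative assumptions for a single base NN, not the paper's exact implementation.

```python
import numpy as np

def elm_output_weights(X, T, W, b):
    """Closed-form ELM training step (a minimal sketch, not the paper's code).

    X : (n_samples, n_features) training inputs
    T : (n_samples, n_classes) one-hot target matrix
    W : (n_features, n_hidden) input weights -- in MMDE-ELM-NNE these would
        be supplied by the evolutionary search; here they are simply given
    b : (n_hidden,) hidden-layer biases, also assumed to be evolved
    """
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))  # hidden-layer output (sigmoid)
    beta = np.linalg.pinv(H) @ T            # output weights in one linear-algebra step
    return beta

def elm_predict(X, W, b, beta):
    """Predict class labels with the trained ELM."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)
```

Because the output weights are obtained in a single pseudoinverse computation, every candidate structure and input-weight set proposed by the evolutionary search can be evaluated cheaply, which is what makes coupling an evolutionary optimizer with the ELM practical.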