Memristor-based convolutional neural networks offer a path toward the ultra-low power consumption and highly parallel computation of the human brain. However, as memristor-based convolutional neural networks are applied to increasingly complex tasks, heavier computational loads demand large numbers of memristors, which significantly increases power consumption and enlarges the network model. To mitigate this problem, this paper proposes an SBT-memristor-based convolutional neural network architecture together with a hybrid optimization method combining pruning and quantization. Firstly, the SBT-memristor-based convolutional neural network is constructed by exploiting the favorable threshold characteristics of the SBT memristor, and the memristive in-memory computing unit, activation unit and max-pooling unit are designed. Then, the hybrid optimization method combining pruning and quantization is applied to improve the SBT-memristor-based convolutional neural network architecture. This hybrid method simplifies the memristor-based neural network and allows the weights stored at the memristive synapses to be represented more efficiently. Finally, the results show that the optimized SBT-memristor-based convolutional neural network requires far fewer memristors, consumes less power and yields a compressed network model at the cost of a small loss in accuracy. It also achieves faster recognition and lower power consumption on the MNIST recognition task. This work provides new insights into applying convolutional neural networks to complex tasks.
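As a rough illustration of the general idea behind such a hybrid compression scheme, the following minimal Python sketch combines magnitude-based pruning with uniform weight quantization; the function names, sparsity level and number of conductance levels are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def prune_weights(weights, sparsity=0.6):
    """Zero out the smallest-magnitude weights (magnitude pruning, assumed criterion)."""
    threshold = np.quantile(np.abs(weights), sparsity)
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

def quantize_weights(weights, num_levels=16):
    """Uniformly quantize weights to a small set of levels, standing in for the
    discrete conductance states that can be programmed into a memristor."""
    w_min, w_max = weights.min(), weights.max()
    if w_max == w_min:
        return weights
    step = (w_max - w_min) / (num_levels - 1)
    return np.round((weights - w_min) / step) * step + w_min

# Example: compress the weights of one toy convolutional kernel.
rng = np.random.default_rng(0)
w = rng.normal(size=(3, 3, 8, 16))                 # hypothetical conv kernel
w_pruned, mask = prune_weights(w, sparsity=0.6)    # remove small weights
w_compressed = quantize_weights(w_pruned) * mask   # quantize, keep pruned entries at zero
print("nonzero weights:", np.count_nonzero(w_compressed), "of", w.size)
```

Pruning removes synapses (and hence memristors) entirely, while quantization restricts the remaining weights to a few programmable conductance levels; in combination they shrink the model and the memristor count, at the cost of the small accuracy loss noted above.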