Deep learning has achieved great success in recent years. However, the uncertainty inherent in the real world has raised concerns about building reliable models, and most existing strategies cannot address this goal in a unified way. The recently developed evidential reasoning rule (ER2), a general and interpretable probabilistic inference engine, integrates reliability to realize adaptive evidence combination and introduces an overall reliability to measure the credibility of the output, which makes it an ideal strategy for helping deep learning build more reliable models. Accordingly, a new deep evidential reasoning rule learning method (DER2) is developed in this study. DER2 consists of a training stage, an adaptation stage, and a testing stage. In the training stage, a deep neural network with multiple fully connected layers is trained. In the adaptation stage, reliability is introduced to tune the trained model and obtain an adapted output for a given test sample. In the testing stage, not only is the predictive output probability obtained, but the overall reliability is also estimated to measure the credibility of the model output, so that the decision maker can determine whether the predictive results should be trusted. Meanwhile, the model output can be interpreted in a case-based way. The experimental results demonstrate that DER2 achieves better performance when the adaptation stage is introduced, and that a high-quality credibility measurement can be realized through the overall reliability.
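
The three-stage workflow can be sketched as follows. This is a minimal illustration assuming a standard PyTorch setup; the reliability score and the reliability-weighted combination used here are simple placeholders standing in for the ER2 evidence combination, whose exact formulas are not given in this abstract, and all function and parameter names below are hypothetical.

```python
# Schematic sketch of a three-stage train / adapt / test pipeline.
# The reliability and combination steps are placeholders, NOT the ER2 formulas.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MLP(nn.Module):
    """Deep neural network with multiple fully connected layers (training stage)."""
    def __init__(self, in_dim: int, hidden: int, n_classes: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_classes),
        )

    def forward(self, x):
        return self.net(x)

def train_stage(model, loader, epochs=10, lr=1e-3):
    """Training stage: standard supervised training with cross-entropy loss."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = F.cross_entropy(model(x), y)
            loss.backward()
            opt.step()
    return model

def adaptation_stage(model, x_test, x_train, p_train, k=5):
    """Adaptation stage (placeholder): combine the network output for each test
    sample with evidence from its k nearest training samples, weighted by a
    distance-based reliability score. p_train holds the trained network's
    softmax outputs on x_train."""
    model.eval()
    with torch.no_grad():
        p_test = F.softmax(model(x_test), dim=-1)        # raw predictive probabilities
        dists = torch.cdist(x_test, x_train)             # distances to training samples
        knn = dists.topk(k, largest=False)
        reliability = torch.exp(-knn.values).mean(dim=1, keepdim=True)  # placeholder reliability
        p_neighbors = p_train[knn.indices].mean(dim=1)   # evidence from similar cases
        # Reliability-weighted combination (stand-in for ER2 evidence combination)
        p_adapted = reliability * p_test + (1.0 - reliability) * p_neighbors
    return p_adapted, reliability

def testing_stage(p_adapted, reliability, threshold=0.5):
    """Testing stage: return the prediction together with an overall reliability
    used to decide whether the prediction should be trusted."""
    pred = p_adapted.argmax(dim=-1)
    overall_reliability = reliability.squeeze(-1)        # placeholder credibility measure
    trusted = overall_reliability > threshold
    return pred, overall_reliability, trusted
```

The nearest-neighbour step also gives a natural hook for the case-based interpretation mentioned above, since each adapted output can be traced back to the training samples that contributed evidence to it.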