In this paper, we first propose a new estimator of entropy for continuous random variables. Our estimator is obtained by correcting the coefficients of Vasicek's [A test for normality based on sample entropy, J. R. Statist. Soc. Ser. B 38 (1976), pp. 54-59] entropy estimator, and we prove its consistency. Monte Carlo studies show that our estimator outperforms the entropy estimators proposed by Vasicek, Ebrahimi et al. [Two measures of sample entropy, Stat. Probab. Lett. 20 (1994), pp. 225-234] and Correa [A new estimator of entropy, Commun. Stat. Theory Methods 24 (1995), pp. 2439-2449] in terms of root mean square error. We then derive the non-parametric distribution function corresponding to our proposed entropy estimator, which is a piecewise uniform distribution. We also introduce goodness-of-fit tests for exponentiality and normality based on this distribution and compare their performance with that of their leading competitors.
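For context, the spacing-based estimator of Vasicek (1976), whose coefficients the proposed estimator modifies, can be written as
\[
HV_{mn} \;=\; \frac{1}{n}\sum_{i=1}^{n}\log\!\left\{\frac{n}{2m}\bigl(X_{(i+m)}-X_{(i-m)}\bigr)\right\},
\]
where $X_{(1)}\le\cdots\le X_{(n)}$ are the order statistics of the sample, $m<n/2$ is a window size, and the convention $X_{(i-m)}=X_{(1)}$ for $i\le m$ and $X_{(i+m)}=X_{(n)}$ for $i\ge n-m$ is used. This formula is stated only as a reference point; the corrected coefficients that define the proposed estimator are given in the body of the paper and are not reproduced here.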