The minimum disparity estimators proposed by Lindsay (1994) for discrete models form an attractive subclass of minimum distance estimators that achieve robustness without sacrificing first-order efficiency at the model. Similarly, disparity test statistics are useful robust alternatives to the likelihood ratio test for testing hypotheses in parametric models; they are asymptotically equivalent to the likelihood ratio test statistics under the null hypothesis and under contiguous alternatives. Despite these asymptotic optimality properties, the small-sample performance of many minimum disparity estimators and disparity tests can be considerably worse than that of the maximum likelihood estimator and the likelihood ratio test, respectively. In this paper we focus on the class of blended weight Hellinger distances, a general subfamily of disparities; we study the effect of combining two different distances within this class to generate the family of "combined" blended weight Hellinger distances, and identify the members of this family that generally perform well. More generally, we investigate the class of "combined and penalized" blended weight Hellinger distances, where the penalty is based on reweighting the empty cells, following Harris and Basu (1994). It is shown that some members of the combined and penalized family have rather attractive properties.
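As a rough illustration of the family discussed above, the blended weight Hellinger distance with tuning parameter alpha can be sketched as follows. The function name, signature, and the `penalty` argument are illustrative assumptions, not the paper's notation; the empty-cell reweighting shown is only a hedged sketch of the kind of penalty attributed to Harris and Basu (1994), and the exact form used in the paper may differ.

```python
import math

def bwhd(d, f, alpha, penalty=None):
    """Sketch of a blended weight Hellinger distance between an empirical
    mass function d and a model mass function f over the same cells.

    Standard special cases of this family:
      alpha = 0   -> one half of Pearson's chi-square,
      alpha = 1/2 -> twice the squared Hellinger distance,
      alpha = 1   -> one half of Neyman's chi-square.

    If `penalty` (h) is supplied, each empty cell's contribution is
    replaced by h * f(x) -- an assumed form of the empty-cell
    reweighting idea, included for illustration only.
    """
    total = 0.0
    for dx, fx in zip(d, f):
        if penalty is not None and dx == 0.0:
            total += penalty * fx  # reweighted empty cell (hypothetical form)
        else:
            blend = alpha * math.sqrt(dx) + (1 - alpha) * math.sqrt(fx)
            total += (dx - fx) ** 2 / (2 * blend ** 2)
    return total
```

A minimum disparity estimator would then minimize such a distance over the model's parameters; a "combined" member of the family would, loosely, apply different alpha values to different kinds of cells (e.g. inliers versus outliers).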