Let $F$ denote a distribution function defined on the probability space $(\Omega, \mathcal{F}, P)$ that is absolutely continuous with respect to Lebesgue measure on $\mathbb{R}^d$, with probability density function $f$. Let $f_0(\cdot, \beta)$ be a parametric density function depending on an unknown $p \times 1$ parameter vector $\beta$. In this paper we consider tests of the goodness of fit of $f_0(\cdot, \beta)$ to $f(\cdot)$, for some $\beta$, based on (i) the integrated squared difference between a kernel estimate of $f(\cdot)$ and the quasi-maximum likelihood estimate of $f_0(\cdot, \beta)$, denoted by $I_n$, and (ii) the integrated squared difference between a kernel estimate of $f(\cdot)$ and the corresponding kernel-smoothed estimate of $f_0(\cdot, \beta)$, denoted by $J_n$. We show that the amount of smoothing applied to the data in constructing the kernel estimate of $f(\cdot)$ determines the form of the test statistic based on $I_n$. For each test developed, we also examine its asymptotic properties, including consistency and local power. In particular, we show that the tests developed in this paper, except the first one, are more powerful than the Kolmogorov-Smirnov test under the sequence of local alternatives introduced in Rosenblatt [12], although they are less powerful than the Kolmogorov-Smirnov test under the sequence of Pitman alternatives. A small simulation study is carried out to examine the finite-sample performance of one of these tests.
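As a rough indication of the quantities being compared, the two statistics presumably take the following integrated-squared-difference form, where $\hat{f}_n$ denotes the kernel density estimate of $f$, $\hat{\beta}_n$ the quasi-maximum likelihood estimate of $\beta$, and $\tilde{f}_{0,n}(\cdot, \hat{\beta}_n)$ the kernel-smoothed estimate of $f_0(\cdot, \hat{\beta}_n)$; these symbols are our own shorthand rather than the paper's notation, and the precise definitions are given in the body of the paper:
\[
  I_n = \int_{\mathbb{R}^d} \bigl\{ \hat{f}_n(x) - f_0(x, \hat{\beta}_n) \bigr\}^2 \, dx,
  \qquad
  J_n = \int_{\mathbb{R}^d} \bigl\{ \hat{f}_n(x) - \tilde{f}_{0,n}(x, \hat{\beta}_n) \bigr\}^2 \, dx .
\]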