Two random variables X and Y on a common probability space are mutually completely dependent (m.c.d.) if each one is a function of the other with probability one. For continuous X and Y, a natural approach to constructing a measure of dependence is via the distance between the copula of X and Y and the independence copula. We show that this approach depends crucially on the choice of the distance function. For example, the $L_p$-distances, suggested by Schweizer and Wolff, cannot generate a measure of (mutual complete) dependence, since every copula is the uniform limit of copulas of m.c.d. random variables; with respect to these distances, mutual complete dependence is therefore indistinguishable from any other type of dependence, including independence. Instead, we propose to use a modified Sobolev norm, with respect to which mutual complete dependence cannot approximate any other kind of dependence. This Sobolev norm yields the first nonparametric measure of dependence that, among other things, captures precisely the two extremes of dependence: it equals 0 if and only if X and Y are independent, and 1 if and only if X and Y are m.c.d. Examples are given to illustrate the difference from the Schweizer–Wolff measure.
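For orientation, the following is a sketch of how such a Sobolev-norm construction can look; the explicit formulas below are one plausible reading of the "modified Sobolev norm" and its normalization, stated here as an assumption rather than quoted from the paper:
\[
  \|C\|^2 \;=\; \int_{[0,1]^2} \bigl|\nabla C(u,v)\bigr|^2 \,\mathrm{d}u\,\mathrm{d}v,
  \qquad
  \omega(X,Y) \;=\; \sqrt{3\,\|C\|^2 - 2}\,,
\]
where $C$ is the copula of $X$ and $Y$. Under this normalization, the behavior at the two extremes described above can be checked directly: the independence copula $\Pi(u,v) = uv$ gives $\|\Pi\|^2 = 2/3$ and hence $\omega = 0$, while $\|C\|^2 \le 1$ for every copula, with equality (and hence $\omega = 1$) exactly when both partial derivatives of $C$ take only the values $0$ and $1$ almost everywhere, i.e., when each variable is almost surely a function of the other.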