We consider the method for constrained convex optimization in a Hilbert space, consisting of a step in the direction opposite to an $\varepsilon_k$-subgradient of the objective at the current iterate, followed by an orthogonal projection onto the feasible set. The normalized stepsizes $\alpha_k$ are exogenously given, satisfying $\sum_{k=0}^{\infty}\alpha_k=\infty$ and $\sum_{k=0}^{\infty}\alpha_k^2<\infty$, and $\varepsilon_k$ is chosen so that $\varepsilon_k\le\mu\alpha_k$ for some $\mu>0$. We prove that the sequence generated in this way converges weakly to a minimizer if the problem has solutions, and is unbounded otherwise. Among the features of our convergence analysis, we mention that it covers the nonsmooth case, in the sense that we make no assumption of differentiability of the objective $f$, let alone Lipschitz continuity of its gradient. Also, we prove weak convergence of the whole sequence, rather than just boundedness of the sequence and optimality of its weak accumulation points, thus improving over all previously known convergence results. We also present convergence rate results. © 1998 The Mathematical Programming Society, Inc. Published by Elsevier Science B.V.
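As an illustration, the following is a minimal numerical sketch of a projected subgradient iteration of the kind described above. The particular objective $f(x)=\sum_i|x_i-1|$, the Euclidean-ball feasible set, the use of exact subgradients in place of the $\varepsilon_k$-subgradient oracle, and the normalization $\max(1,\|g\|)$ are illustrative assumptions, not the exact scheme analyzed in the paper.

```python
import numpy as np

def project_onto_ball(x, radius=1.0):
    # Orthogonal projection onto the Euclidean ball of given radius
    # (an example feasible set C with an easy projection).
    norm = np.linalg.norm(x)
    return x if norm <= radius else (radius / norm) * x

def subgradient_abs_sum(x):
    # A subgradient of f(x) = sum_i |x_i - 1| (a nonsmooth objective);
    # here an exact subgradient stands in for the epsilon_k-subgradient oracle.
    return np.sign(x - 1.0)

def projected_subgradient(x0, n_iter=5000):
    x = x0.copy()
    for k in range(n_iter):
        alpha_k = 1.0 / (k + 1)            # sum alpha_k = inf, sum alpha_k^2 < inf
        g = subgradient_abs_sum(x)
        g_norm = max(np.linalg.norm(g), 1.0)
        # Step opposite the subgradient with a normalized stepsize,
        # then project back onto the feasible set.
        x = project_onto_ball(x - (alpha_k / g_norm) * g)
    return x

x_approx = projected_subgradient(np.zeros(3))
print(x_approx)  # approaches (1,1,1)/sqrt(3), the minimizer of f over the unit ball
```

The stepsize choice $\alpha_k=1/(k+1)$ satisfies the divergent-sum/square-summable conditions stated above; any other exogenous sequence with those properties would serve equally well in this sketch.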