A new type of feedback strategy for stabilization to a point is presented for a class of drift-free systems. The approach is based on the construction of a cost function that is the maximum of a finite number of component functions. The stabilizing control is defined in terms of a set of nested, discrete processes whose task is to minimize this non-differentiable cost. Repeated application of these processes yields a sequence of points along the controlled trajectory. While the corresponding sequence of cost values decreases monotonically, the cost, viewed as a continuous function of time, decays along the controlled system trajectories only asymptotically. The stabilizing properties of the resulting feedback strategy are discussed.
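To fix ideas, the following is a minimal sketch, not the paper's construction: a drift-free system (a unicycle is assumed here as an example), a cost taken as the maximum of a finite family of smooth component functions (the particular components below are hypothetical placeholders), and a discrete step that selects, from a finite candidate set of controls, one that decreases the max-type cost. Repeating the step produces a sequence of points with monotonically non-increasing cost values, as described above.

```python
import numpy as np

def g(x):
    """Control vector fields of a drift-free unicycle x' = g1(x) u1 + g2(x) u2."""
    theta = x[2]
    return np.array([[np.cos(theta), 0.0],
                     [np.sin(theta), 0.0],
                     [0.0,           1.0]])

def cost(x):
    """Non-differentiable cost: maximum of a finite family of smooth components.
    The components used here are illustrative placeholders only."""
    components = [x[0]**2 + x[1]**2, x[2]**2, (x[0] + x[2])**2]
    return max(components)

def flow(x, u, dt, steps=20):
    """Integrate x' = g(x) u over time dt with the control u held constant."""
    h = dt / steps
    for _ in range(steps):
        x = x + h * g(x) @ u
    return x

def descent_step(x, dt=0.5):
    """One discrete minimization pass: try a finite set of candidate controls
    and keep the endpoint giving the largest decrease of the max-type cost."""
    candidates = [np.array([a, b]) for a in (-1.0, 0.0, 1.0)
                                   for b in (-1.0, 0.0, 1.0)]
    best_x, best_v = x, cost(x)
    for u in candidates:
        x_new = flow(x, u, dt)
        v_new = cost(x_new)
        if v_new < best_v:
            best_x, best_v = x_new, v_new
    return best_x

# Repeated application yields a sequence of points along the controlled
# trajectory whose cost values are monotonically non-increasing.
x = np.array([1.0, 0.5, 0.3])
for k in range(50):
    x = descent_step(x)
print(cost(x))
```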