There is mounting observational evidence that the expansion of our Universe is undergoing a late-time acceleration. Among the many proposals to describe this phenomenon, the cosmological constant ($\Lambda$) seems to be the simplest and most natural explanation. However, despite its observational successes, such a possibility exacerbates the well-known $\Lambda$ problem, requiring a natural explanation for its small, but nonzero, value. In this paper we consider a cosmological scenario driven by a varying cosmological term, in which the vacuum energy density decays linearly with the Hubble parameter, $\Lambda \propto H$. We show that this $\Lambda(t)$CDM model is indistinguishable from the standard one ($\Lambda$CDM) in its overall expansion history: the early radiation phase is followed by a long dust-dominated era, and only recently does the varying $\Lambda$ term become dominant, accelerating the cosmic expansion. In order to test the viability of this scenario, we have used the most recent type Ia supernova data, namely, the samples of the High-Z SN Search Team and the Supernova Legacy Survey (SNLS) Collaboration. In particular, for the SNLS sample we have found $0.27 \leq \Omega_{\rm m} \leq 0.37$ and $0.68 \leq H_0 \leq 0.72$ (at $2\sigma$), with $H_0$ in units of $100\ {\rm km\ s^{-1}\ Mpc^{-1}}$, in good agreement with currently accepted estimates of these parameters.
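As a concrete illustration of how such a scenario is confronted with supernova data, the sketch below computes the theoretical distance modulus for the $\Lambda \propto H$ model. It assumes the dust-phase solution of this class of models, $H(z) = H_0\,[\,1 - \Omega_{\rm m} + \Omega_{\rm m}(1+z)^{3/2}\,]$, which follows from the Friedmann equations with $\Lambda = \sigma H$ and pressureless matter in a spatially flat geometry (since $\dot{H} = -\tfrac{3}{2}H^2 + \tfrac{\sigma}{2}H$ implies $H - \sigma/3 \propto a^{-3/2}$). The function names and sample parameter values are illustrative, not the actual fitting pipeline used in the paper.

```python
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458  # speed of light [km/s]

def E_lambda_t(z, omega_m):
    """H(z)/H0 in the dust phase of the Lambda ~ H model:
    E(z) = 1 - Omega_m + Omega_m * (1 + z)**(3/2)."""
    return 1.0 - omega_m + omega_m * (1.0 + z) ** 1.5

def luminosity_distance_mpc(z, omega_m, h):
    """Luminosity distance [Mpc] in a spatially flat geometry:
    d_L = (1 + z) * (c/H0) * integral_0^z dz' / E(z')."""
    H0 = 100.0 * h  # Hubble constant [km/s/Mpc]
    integral, _ = quad(lambda zp: 1.0 / E_lambda_t(zp, omega_m), 0.0, z)
    return (1.0 + z) * (C_KM_S / H0) * integral

def distance_modulus(z, omega_m, h):
    """Distance modulus mu = 5*log10(d_L / 10 pc), with d_L in Mpc."""
    return 5.0 * np.log10(luminosity_distance_mpc(z, omega_m, h)) + 25.0

# Illustrative values inside the 2-sigma intervals quoted above
print(distance_modulus(0.5, omega_m=0.32, h=0.70))  # ~42.3
```

In an analysis like the one summarized above, this predicted $\mu(z; \Omega_{\rm m}, h)$ would be compared with the observed distance moduli of each supernova through a $\chi^2$ statistic, whose minimization and marginalization yield confidence intervals such as the $2\sigma$ ranges quoted for $\Omega_{\rm m}$ and $H_0$.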