MCMC sampling is a methodology of growing importance in statistical signal processing. It has been of particular value to Bayesian approaches to signal processing, since it significantly extends the range of problems they can address. MCMC techniques generate samples from desired distributions by embedding them as limiting distributions of Markov chains. There are many ways of categorizing MCMC methods, but the simplest is to classify them into one of two groups: the first is used in estimation problems where the unknowns are typically the parameters of a model assumed to have generated the observed data; the second is employed in more general scenarios where the unknowns are not only model parameters but also the models themselves. In this paper, we address the MCMC methods of the second group, which allow samples to be generated from probability distributions defined on unions of disjoint spaces of different dimensions. More specifically, we show why sampling from such distributions is a nontrivial task. We also demonstrate that these methods genuinely unify the operations of detection and estimation and thereby hold great potential for a variety of important applications. The focus is mainly on reversible jump MCMC (Green, Biometrika 82 (1995) 711), but other approaches are also discussed. Implementation details of reversible jump MCMC are provided for two examples.
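To make the idea of sampling across spaces of different dimensions concrete, the sketch below implements a minimal reversible jump sampler for a toy problem of our own choosing (not one of the paper's two examples): deciding whether Gaussian data have zero mean (model 0) or an unknown mean mu (model 1), and estimating mu whenever model 1 is visited. The birth/death moves, the proposal q(u) centred at the sample mean, and the constants SIGMA0 and P_MODEL are illustrative assumptions; the dimension-matching Jacobian equals 1 because the birth move is an identity mapping.

```python
# A minimal reversible jump MCMC sketch for a hypothetical toy problem
# (not one of the paper's examples): model 0 says the data have zero mean,
# model 1 has an unknown mean mu.  All constants are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 50 observations with true mean 0.5 and unit noise variance.
y = rng.normal(0.5, 1.0, size=50)
n = len(y)

SIGMA0 = 2.0           # prior std of mu under model 1: mu ~ N(0, SIGMA0^2)
P_MODEL = (0.5, 0.5)   # prior model probabilities

def log_lik(mu):
    """Gaussian log-likelihood of y with mean mu and unit variance."""
    return -0.5 * np.sum((y - mu) ** 2) - 0.5 * n * np.log(2.0 * np.pi)

def log_prior_mu(mu):
    return -0.5 * (mu / SIGMA0) ** 2 - np.log(SIGMA0 * np.sqrt(2.0 * np.pi))

def log_q(u):
    """Log density of the birth proposal q(u) = N(sample mean, 1/n)."""
    return 0.5 * np.log(n / (2.0 * np.pi)) - 0.5 * n * (u - y.mean()) ** 2

k, mu = 0, 0.0         # chain state: model index k and (if k == 1) the mean mu
samples = []

for it in range(20000):
    # Trans-dimensional move: birth (0 -> 1) or death (1 -> 0).
    if k == 0:
        u = rng.normal(y.mean(), 1.0 / np.sqrt(n))   # propose mu = u
        # Identity mapping u -> mu, so the Jacobian is 1 (log-Jacobian 0).
        log_alpha = (log_lik(u) + log_prior_mu(u) + np.log(P_MODEL[1])
                     - log_lik(0.0) - np.log(P_MODEL[0]) - log_q(u))
        if np.log(rng.uniform()) < log_alpha:
            k, mu = 1, u
    else:
        # Death move: drop mu; the reverse birth proposal density reappears.
        log_alpha = (log_lik(0.0) + np.log(P_MODEL[0]) + log_q(mu)
                     - log_lik(mu) - log_prior_mu(mu) - np.log(P_MODEL[1]))
        if np.log(rng.uniform()) < log_alpha:
            k, mu = 0, 0.0

    # Within-model random-walk Metropolis update of mu (model 1 only).
    if k == 1:
        mu_prop = mu + rng.normal(0.0, 0.2)
        log_alpha = (log_lik(mu_prop) + log_prior_mu(mu_prop)
                     - log_lik(mu) - log_prior_mu(mu))
        if np.log(rng.uniform()) < log_alpha:
            mu = mu_prop

    samples.append((k, mu))

ks = np.array([s[0] for s in samples[5000:]])
mus = np.array([s[1] for s in samples[5000:]])
print("estimated P(model 1 | y):", ks.mean())                      # detection
print("posterior mean of mu given model 1:", mus[ks == 1].mean())  # estimation
```

The fraction of iterations the chain spends in model 1 estimates the posterior model probability (detection), while the samples of mu collected during visits to model 1 provide the parameter estimate (estimation); this is the sense in which the two operations are unified by a single sampler.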