This paper considers the development of microbiological risk assessment models for pathogenic agents in drinking water, with particular reference to Cryptosporidium parvum, rotavirus and the agent of bovine spongiform encephalopathy (BSE). The available evidence suggests that there is potential for considerable variation in exposure to C. parvum oocysts through drinking water, under both outbreak and non-outbreak conditions. This spatial/temporal heterogeneity arises both from variation in oocyst densities in the raw water and from fluctuations in the removal efficiency of drinking water treatment. In terms of risk prediction, modelling the variation in doses ingested by individual drinking water consumers is unimportant if the dose-response curve is linear and the oocysts act independently during infection. Indeed, the total pathogen loading on the population, as represented by the arithmetic mean exposure, is sufficient for risk prediction for C. parvum, BSE and other agents of low infectivity, provided the infecting particles (i.e. oocysts or BSE prions) are known to act independently. However, for more highly infectious agents, such as rotavirus, ignoring the variation and using only the arithmetic mean exposure may over-estimate the risk about threefold. If it were shown that pathogens co-operate with each other during initiation of infection, such that the dose-response relationship is non-linear, then modelling the variation in doses ingested by individual consumers would be very important. Possible mechanisms for co-operation of pathogens during infection are considered. Simulations show that acquired protective immunity to C. parvum reduces the risk of infection during outbreak conditions by more than 10-fold. Variation in virulence between strains of C. parvum is a further source of uncertainty.
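The contrast between low- and high-infectivity agents can be illustrated with the single-hit exponential dose-response model, P(inf) = 1 − exp(−r·d), which is approximately linear in dose when r·d is small. The Monte Carlo sketch below compares the risk computed from the arithmetic mean dose with the true mean risk across a variable dose distribution; the lognormal dose parameters and the two r values are illustrative assumptions, not figures from the paper:

```python
import math
import random

random.seed(1)

# Hypothetical lognormal distribution of per-consumer ingested doses
# (illustrative parameters only, not values from the paper).
doses = [random.lognormvariate(0.0, 1.0) for _ in range(100_000)]
mean_dose = sum(doses) / len(doses)

def p_infection(dose, r):
    """Exponential (single-hit, independent-action) dose-response model."""
    return 1.0 - math.exp(-r * dose)

ratios = {}
for r in (0.005, 1.0):  # low infectivity vs. high infectivity (assumed values)
    risk_at_mean_dose = p_infection(mean_dose, r)  # risk predicted from mean exposure
    true_mean_risk = sum(p_infection(d, r) for d in doses) / len(doses)
    ratios[r] = risk_at_mean_dose / true_mean_risk

print(ratios)  # low-r ratio close to 1; high-r ratio noticeably above 1
```

Because 1 − exp(−r·d) is concave in dose, Jensen's inequality guarantees that the risk evaluated at the arithmetic mean dose over-estimates the true mean risk; in the near-linear low-r case the two essentially coincide, consistent with the argument that the arithmetic mean exposure suffices for agents of low infectivity.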