To support early model validation, this paper describes a method utilizing information obtained from common-practice component-level validation to assess uncertainties at the model top level. Initiated in previous research, a generic output uncertainty description component, intended for power-port-based simulation models of physical systems, has been implemented in Modelica. A set of model components has been extended with the generic output uncertainty description, and the concept of using component-level output uncertainty to assess model top-level uncertainty has been applied to a simulation model of a radar liquid cooling system. The focus of this paper is on investigating the applicability of combining the output uncertainty method with probabilistic techniques, not only to provide upper and lower bounds on model uncertainties but also to accompany the uncertainties with estimated probabilities. It is shown that the method may result in a significant improvement in the conditions for conducting an assessment of model uncertainties. The primary use of the method, in combination with either deterministic or probabilistic techniques, is in the early development phases when system-level measurement data are scarce. The method may also be used to point out which model components contribute most to the uncertainty at the model top level. Such information can be used to concentrate physical testing activities on the areas where they are needed most. In this context, the method supports the concept of Virtual Testing.
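To illustrate the concept summarized above, the sketch below shows one way a component could expose a generic output uncertainty description alongside its nominal output. This is a minimal, hypothetical example, not the paper's actual implementation: the component name, the quadratic pressure-drop relation, and the relative-uncertainty parameter are all illustrative assumptions.

```modelica
// Minimal sketch (assumed names and physics): a hypothetical cooling-system
// component that exposes its component-level output uncertainty as upper and
// lower bounds around the nominal result.
model PipeWithOutputUncertainty
  parameter Real k = 1e5 "Assumed flow-resistance coefficient [Pa/(kg/s)^2]";
  parameter Real relUncertainty = 0.05 "Assumed component-level relative output uncertainty";
  input Real m_flow "Mass flow rate through the component [kg/s]";
  output Real dp "Nominal pressure drop [Pa]";
  output Real dp_upper "Upper uncertainty bound on the pressure drop [Pa]";
  output Real dp_lower "Lower uncertainty bound on the pressure drop [Pa]";
equation
  dp = k*m_flow^2;                    // nominal component behaviour
  dp_upper = dp*(1 + relUncertainty); // upper bound from component-level validation data
  dp_lower = dp*(1 - relUncertainty); // lower bound from component-level validation data
end PipeWithOutputUncertainty;
```

At the model top level, such component-level bounds can then be combined either deterministically (e.g. worst-case accumulation) or probabilistically (e.g. sampling within the bounds) to obtain top-level uncertainty estimates accompanied by estimated probabilities, in line with the approach outlined above.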