The deficiency of [CII] (158 μm) line emission in many normal and ultraluminous galaxies is one of the major surprises from ISO-LWS observations. We show that this is not an isolated phenomenon: there is a smooth decline in the L_[CII]/L_FIR ratio with increasing dust temperature (as indicated by the far-infrared colors F_ν(60 μm)/F_ν(100 μm), i.e., F60/F100) and star-formation activity (indicated by L_FIR/L_B), independent of luminosity or morphology. In a sample of 60 normal galaxies, these trends span a factor of 100. Of the numerous explanations proposed for the L_[CII]/L_FIR variation, the leading ones are (a) optical depth and extinction, (b) a softer radiation field from old stellar populations, and (c) inefficient photoelectric heating by charged grains when the UV radiation density per gas atom (G_0/n) is high. We can rule out hypothesis (a) with the observation that the [OI]/[CII] line ratio increases for galaxies with higher F60/F100. This is contrary to the expectation that [OI] at 63 μm should be more severely affected by extinction because it lies at a shorter wavelength. Optical depth should also affect the [OI] 63 μm line more strongly because OI exists deeper (to A_V = 10) in the interiors of clouds than CII. Hypothesis (b) explains the slight decrease in L_[CII]/L_FIR seen in early-type galaxies with low rates of star formation and the lowest L_FIR/L_B in the sample. The dramatic fall in L_[CII]/L_FIR for the warmest and most actively star-forming galaxies is best explained by hypothesis (c). In galaxies with warmer dust, there is less cooling via the [CII] line, while [OI] remains a major coolant. This trend is qualitatively explained in PDR models by an increase in the radiation field G_0, which raises both the dust temperature and the [OI]/[CII] line ratio.
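The mechanism behind hypothesis (c) can be sketched schematically. When the UV field per gas particle is high, small grains and PAHs become positively charged, raising the photoelectric work required and lowering the gas-heating efficiency ε; meanwhile the absorbed UV still emerges as far-infrared dust continuum. A qualitative form (ε_0, k, and α here stand in for detailed PDR-model constants and are illustrative, not taken from the text) is:

```latex
\epsilon \;\approx\; \frac{\epsilon_0}{1 + k\,\gamma^{\alpha}},
\qquad
\gamma \equiv \frac{G_0\,T^{1/2}}{n_e},
\qquad
\frac{L_{\mathrm{[CII]}}}{L_{\mathrm{FIR}}} \;\propto\; \epsilon .
```

Because [CII] is a principal coolant of the photoelectrically heated gas while L_FIR traces the total absorbed starlight, the ratio L_[CII]/L_FIR tracks ε and therefore falls as the charging parameter γ (and hence G_0/n) increases, consistent with the observed decline toward warmer, more actively star-forming galaxies.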