A resistor-temperature noise model has been successfully applied to extrinsic FETs to predict the frequency dependence of the minimum noise figure, F_min, and the associated gain, G_a. The model gives a fixed relationship between F_min and G_a, with a single fitting parameter, T_d, the equivalent temperature of the output resistor. An extensive comparison with published results shows that the majority of FETs can be modelled with effective T_d values between 300 and 700 K across all frequencies (8 to 94 GHz), gate lengths (0.8 to 0.1 μm), and material types examined. The analysis shows that InP-based MODFETs exhibit significantly lower F_min and higher G_a than conventional and pseudomorphic GaAs-based MODFETs of the same gate length. The results suggest that a high f_max is a key factor for achieving a low noise figure.
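To make the single-parameter fit concrete, the sketch below computes F_min versus frequency assuming the widely used two-temperature (Pospieszalski-type) formulation, in which the gate resistance sits at an ambient temperature T_g and the output (drain) resistance at the fitted temperature T_d. The abstract does not give the explicit expression or any device values, so the formula and all element values here are illustrative assumptions, not the authors' extracted parameters.

```python
import numpy as np

# Illustrative intrinsic-element values (hypothetical, not taken from the paper)
T0  = 290.0      # standard reference temperature, K
Tg  = 300.0      # gate (input) resistor temperature, K
Td  = 500.0      # drain (output) resistor temperature, K -- the single fitting parameter
gm  = 50e-3      # transconductance, S
Cgs = 0.15e-12   # gate-source capacitance, F
rgs = 2.0        # intrinsic gate-source resistance, ohm
gds = 5e-3       # output conductance, S

fT = gm / (2.0 * np.pi * Cgs)  # intrinsic cutoff frequency, Hz


def fmin_dB(f_hz):
    """Minimum noise figure (dB) vs frequency for a two-temperature FET noise model."""
    x = f_hz / fT
    # Minimum noise temperature of the intrinsic device
    Tmin = (2.0 * x * np.sqrt(gds * rgs * Tg * Td + (x * gds * rgs * Td) ** 2)
            + 2.0 * x ** 2 * gds * rgs * Td)
    return 10.0 * np.log10(1.0 + Tmin / T0)


# Evaluate at the frequencies spanned by the comparison (8 to 94 GHz)
for f_ghz in (8, 18, 32, 60, 94):
    print(f"{f_ghz:3d} GHz : Fmin ≈ {fmin_dB(f_ghz * 1e9):.2f} dB")
```

With T_d as the only free parameter, sweeping it over the 300 to 700 K range reported above shifts the whole F_min-versus-frequency curve, which is how a single effective drain temperature can be fitted to a given device's measured noise data.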