Self-diffusion has been experimentally studied in a two-dimensional underdamped liquid complex (dusty) plasma. The self-diffusion coefficient $D$ was found to increase linearly with the temperature $T$: $D/(\omega_E a^2) = (0.019 \pm 0.007)(T/T_m - 1)$, where $T_m$, $\omega_E$, and $a$ are the melting temperature, the Einstein frequency, and the mean particle separation, respectively. No superdiffusion was observed, while subdiffusion occurred at temperatures close to melting.
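The linear fit above can be sketched numerically. Only the slope $0.019 \pm 0.007$ and the functional form come from the abstract; the sample temperature used below is a hypothetical placeholder for illustration.

```python
def reduced_diffusion(T_over_Tm, slope=0.019):
    """Reduced self-diffusion coefficient D / (omega_E * a**2)
    from the linear fit D/(omega_E a^2) = slope * (T/T_m - 1).

    T_over_Tm : temperature in units of the melting temperature T_m
    slope     : fitted coefficient, 0.019 +/- 0.007 per the abstract
    """
    return slope * (T_over_Tm - 1.0)

# Hypothetical example: a liquid at twice the melting temperature.
print(reduced_diffusion(2.0))  # 0.019 in units of omega_E * a**2

# At T = T_m the fit extrapolates to zero diffusion.
print(reduced_diffusion(1.0))  # 0.0
```

Note that the fit is stated in dimensionless (reduced) units, so recovering $D$ in physical units requires multiplying by $\omega_E a^2$ for the particular plasma.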