In this paper, we present a theoretical framework for interpreting the hot-spot electron temperature ($T_e$) inferred from hard (10- to 20-keV) x-ray continuum emission for inertial confinement fusion implosions on OMEGA. We first show that the inferred $T_e$ represents the emission-weighted harmonic mean of the hot-spot $T_e$ distribution, both spatially and temporally. A scheme is then provided for selecting a photon energy at which the emission weighting approximates neutron weighting. Simulations are then used to quantify the predicted relationship between the inferred $T_e$, the neutron-weighted $T_i$, and implosion performance on OMEGA. In an ensemble of 1-D simulations, hot-spot thermal nonequilibrium is found to preclude a one-to-one mapping between the inferred $T_e$ and the neutron-weighted $T_i$. The sensitivity of the inferred $T_e$ and of the hard x-ray yield to implosion asymmetry is studied using a 3-D simulation case study with low-harmonic-mode perturbations (i.e., laser beam power imbalance, target offset, and beam-port-geometry departures from spherical symmetry) and laser imprint ($\ell_{\max} = 200$).
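As an illustrative sketch of the weighting described above (the notation here is generic and not necessarily that used in the body of the paper): for free-free continuum emission at photon energy $h\nu$, the local emissivity scales approximately as $\epsilon_\nu \propto n_e n_i Z^2\, T_e^{-1/2} e^{-h\nu/T_e}$, and an emission-weighted harmonic mean of the hot-spot temperature, taken over the emitting volume and the emission time, would have the form
\[
  \frac{1}{\langle T_e \rangle_{h\nu}}
    = \frac{\displaystyle \int \! dt \int_{\text{hot spot}} dV\;
            \epsilon_\nu(\mathbf{r},t)\, T_e^{-1}(\mathbf{r},t)}
           {\displaystyle \int \! dt \int_{\text{hot spot}} dV\;
            \epsilon_\nu(\mathbf{r},t)} .
\]
Under this reading, the photon-energy selection mentioned in the abstract amounts to choosing $h\nu$ so that the temperature dependence of $\epsilon_\nu(T_e)$ approximates that of the neutron-emission weight (roughly $n^2\langle\sigma v\rangle_{\mathrm{DT}}(T_i)$) over the relevant hot-spot temperatures; the precise criterion is developed in the body of the paper.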