We investigate the effects of dust on Large Magellanic Cloud (LMC) H II region spectral energy distributions using arcminute-resolution far-ultraviolet (FUV), Halpha, far-infrared (FIR), and radio images. Widely used indicators of the amount of light lost to dust (attenuation) at Halpha and in the FUV correlate with each other, although often with substantial scatter. There are two interesting systematic discrepancies. First, Halpha attenuations estimated from the Balmer decrement are lower than those estimated from the ratio of Halpha to thermal radio luminosity. Our data, at this stage, cannot unambiguously identify the source of this discrepancy. Second, the attenuation at 1500 Angstrom and the UV spectral slope, beta, correlate, although the slope and scatter of this correlation differ substantially from those of the relation first derived for starbursting galaxies by Calzetti et al. Combining our result with those of Meurer et al. for ultraluminous infrared galaxies and Calzetti et al. for starbursting galaxies, we conclude that no single relation between beta and the attenuation at 1500 Angstrom is applicable to all star-forming systems.
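For reference, a minimal sketch of the two Halpha attenuation estimators being compared, written in their standard forms; the Case B intrinsic ratio of 2.86 and the extinction-curve coefficients k_Halpha and k_Hbeta are generic illustrative values, not results of this work:

\[
E(B-V) = \frac{2.5}{k_{\mathrm{H}\beta} - k_{\mathrm{H}\alpha}}\,
\log_{10}\!\left[\frac{(\mathrm{H}\alpha/\mathrm{H}\beta)_{\mathrm{obs}}}{2.86}\right],
\qquad
A_{\mathrm{H}\alpha}^{\mathrm{Balmer}} = k_{\mathrm{H}\alpha}\,E(B-V),
\]

\[
A_{\mathrm{H}\alpha}^{\mathrm{radio}} = 2.5\,
\log_{10}\!\left[\frac{L_{\mathrm{H}\alpha}^{\mathrm{pred}}(\mathrm{radio})}{L_{\mathrm{H}\alpha}^{\mathrm{obs}}}\right],
\]

where L_Halpha^pred(radio) is the Halpha luminosity predicted from the thermal radio continuum for a dust-free Case B nebula. The UV relations under discussion are, schematically, linear fits of the form A_1500 = a + b*beta, whose coefficients evidently differ between starbursting galaxies, ultraluminous infrared galaxies, and the H II regions studied here.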