Hardfacing alloys were weld-deposited on a base material to provide a wear-resistant surface. Commercially available iron-based hardfacing alloys were evaluated as replacements for cobalt-based alloys to reduce nuclear plant activation levels. Corrosion testing in highly oxygenated environments indicated that commercial iron-based hardfacing alloys in the as-deposited condition have acceptable corrosion resistance when the chromium-to-carbon ratio was > 4. Tristelle 5183, with a high niobium (stabilizer) content, did not follow this trend because niobium-rich carbides precipitated instead of chromium-rich carbides. This result indicated that iron-based hardfacing alloys containing high stabilizer contents may possess good corrosion resistance with a chromium-to-carbon ratio < 4. NOREM 02, NOREM 01, and NoCo-M2 hardfacing alloys had acceptable corrosion resistance in the as-deposited condition and after being heat-treated for 4 h at 885°C (885°C/4-h), but rusting from a form of sensitization was observed for samples heat-treated for 6 h at 621°C (621°C/6-h). The feasibility of using an electrochemical potentiokinetic reactivation (EPR) test method, such as that used for stainless steel (SS), to detect sensitization in iron-based hardfacing alloys was evaluated. A single-loop EPR method provided a more consistent measurement of sensitization than a double-loop EPR method. The high carbon content needed for a wear-resistant hardfacing alloy produced a high volume fraction of chromium-rich carbides that were attacked during EPR testing. This resulted in inherently lower sensitivity for detecting a sensitized iron-based hardfacing alloy than SS using conventional EPR test methods.
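As a minimal illustration of the chromium-to-carbon screening criterion described above, the Python sketch below computes Cr:C ratios from nominal weight-percent compositions and applies the > 4 threshold. The alloy names and composition values here are hypothetical placeholders for illustration only, not measured data from this work, and the screen does not apply to high-stabilizer alloys such as Tristelle 5183, where niobium-rich rather than chromium-rich carbides form.

```python
# Sketch of the empirical Cr:C screening criterion for as-deposited
# iron-based hardfacing alloys. Compositions are hypothetical
# placeholder values (wt%), not data from this study.

CR_TO_C_THRESHOLD = 4.0  # Cr:C > 4 correlated with acceptable corrosion resistance

alloys = {
    # "alloy": (wt% Cr, wt% C) -- placeholder chemistries for illustration
    "example-alloy-A": (25.0, 1.2),
    "example-alloy-B": (12.0, 4.0),
}

for name, (cr_wt_pct, c_wt_pct) in alloys.items():
    ratio = cr_wt_pct / c_wt_pct
    verdict = "passes" if ratio > CR_TO_C_THRESHOLD else "fails"
    print(f"{name}: Cr:C = {ratio:.1f} -> {verdict} the > {CR_TO_C_THRESHOLD:g} screen")
```

This ratio check is only the empirical first-pass screen stated in the abstract; carbide chemistry (e.g., niobium stabilization) can override it.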