Nonparametric regression techniques have been studied extensively in recent years because of their flexibility, and robust versions of these techniques have become popular and have been incorporated into some standard statistical analysis packages. With new techniques comes the responsibility of using them properly and in appropriate situations. Often, as in the case presented here, model-fitting diagnostics such as cross-validation statistics are not available for determining whether the smoothing parameter value being used is preferable to some other arbitrarily chosen value. We present not only a robust nonparametric regression technique that is a strong competitor to the current standard, Loess (Cleveland, 1979), but also an adjusted cross-validation statistic that can be used to select the bandwidth when the data can be assumed to contain outliers. We present the form of the estimators to be compared, the theoretical bias and variance calculations based on the assumed underlying model, the cross-validation technique and the rationale for its components, and a simulation study (single-regressor case) comparing the estimators across varying sample sizes and departures of the true curve from the assumed model. The robust local linear regression (RLLR) fitting procedure with the adjusted cross-validation statistic outperforms currently available fitting techniques (Loess using the default bandwidth value provided by a popular statistical software package, and M-Regression) when there is a moderate to substantial amount of curvature in the true underlying model. In addition, the simulations indicate that, in terms of mean squared error values averaged across simulated fits, the proposed technique may be the superior fitting method for relatively small sample sizes (n less than or equal to 50) across varying amounts of curvature, compared with Loess and M-Regression.
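To make the ideas concrete, the following is a minimal sketch of a robust local linear fit with leave-one-out bandwidth selection. It is not the paper's exact estimator or its adjusted cross-validation statistic: the Gaussian kernel, Tukey bisquare downweighting, MAD scale estimate, candidate bandwidth grid, and the ordinary (unadjusted) cross-validation score shown here are all illustrative assumptions.

```python
# Illustrative sketch only -- NOT the paper's RLLR estimator or its adjusted
# CV statistic. Kernel, psi-function, and bandwidth grid are assumptions.
import numpy as np

def rllr_fit(x, y, x0, h, n_iter=5, c=4.685):
    """Robust local linear fit at x0 via iteratively reweighted least squares:
    Gaussian kernel weights combined with Tukey bisquare robustness weights."""
    kern = np.exp(-0.5 * ((x - x0) / h) ** 2)       # kernel (locality) weights
    X = np.column_stack([np.ones_like(x), x - x0])  # local linear design matrix
    w = kern.copy()
    beta = np.zeros(2)
    for _ in range(n_iter):
        sw = np.sqrt(w)
        # weighted least squares: minimize sum_i w_i * (y_i - X_i beta)^2
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale (MAD)
        u = np.clip(r / (c * s), -1.0, 1.0)
        w = kern * (1.0 - u ** 2) ** 2              # bisquare downweighting
    return beta[0]                                   # intercept = fit at x0

def loo_cv(x, y, h):
    """Ordinary leave-one-out CV score for bandwidth h; the paper's adjusted
    statistic modifies this basic form to cope with outliers."""
    n = len(x)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        errs.append(y[i] - rllr_fit(x[mask], y[mask], x[i], h))
    return float(np.mean(np.square(errs)))

# Usage: curved truth plus injected outliers, as in the single-regressor setting.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 40))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(40)
y[::10] += 3.0                                       # contaminate with outliers
hs = [0.05, 0.1, 0.2]                                # candidate bandwidths (assumed grid)
best_h = min(hs, key=lambda h: loo_cv(x, y, h))
```

The bisquare weights shrink the influence of observations with large residuals toward zero, which is what lets the local fit track the curve rather than the injected outliers.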