For finding a stationary min-max point of a scalar-valued function, we develop and investigate a family of gradient transformation differential equation algorithms. This family includes, as special cases, Min-Max Ascent, Newton's method, and a Gradient Enhanced Min-Max (GEMM) algorithm that we develop. We apply these methods to a sharp-spined "Stingray" saddle function, for which Min-Max Ascent is globally asymptotically stable but stiff, while Newton's method is not stiff but fails to be globally asymptotically stable. GEMM, in contrast, is both globally asymptotically stable and not stiff. Using the Stingray function, we study the stiffness of the gradient transformation family in terms of Lyapunov exponent time histories. Starting from points where Min-Max Ascent, Newton's method, and GEMM all converge, we show that Min-Max Ascent is very stiff, whereas Newton's method is not stiff and is approximately 60 to 440 times faster than Min-Max Ascent. GEMM is globally convergent, is not stiff, and is approximately 3 times faster than Newton's method and approximately 175 to 1000 times faster than Min-Max Ascent.
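As a minimal illustration of the Min-Max Ascent dynamics described above, the sketch below integrates the gradient flow dx/dt = -df/dx, dy/dt = +df/dy with forward Euler on a simple quadratic saddle. The paper's "Stingray" function is not defined in this abstract, so the test function f(x, y) = x^2 - y^2 here is a stand-in assumption, not the paper's actual example.

```python
# Hypothetical toy saddle f(x, y) = x**2 - y**2; the paper's "Stingray"
# function is not given in the abstract, so this stands in for it.
def grad(x, y):
    return 2.0 * x, -2.0 * y  # (df/dx, df/dy)

def min_max_ascent(x0, y0, dt=0.01, steps=2000):
    # Min-Max Ascent gradient flow:
    #   dx/dt = -df/dx  (x is the minimizing player)
    #   dy/dt = +df/dy  (y is the maximizing player)
    # integrated with a simple forward-Euler scheme.
    x, y = x0, y0
    for _ in range(steps):
        gx, gy = grad(x, y)
        x -= dt * gx
        y += dt * gy
    return x, y

x, y = min_max_ascent(1.0, -0.5)
print(x, y)  # both coordinates approach the saddle point (0, 0)
```

For this well-conditioned saddle the flow converges quickly; stiffness of the kind the abstract discusses arises when the Jacobian of the flow has eigenvalues of widely separated magnitudes, forcing very small time steps.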