It is natural to think of Evolutionary Algorithms as highly stochastic search methods. This stochasticity can also make Evolutionary Algorithms, and particularly recombination, quite difficult to analyze. One way to reduce randomness is the quadratization of functions, a transformation commonly used by modern optimization methods that also has applications in quantum computing. Once a function is made quadratic, random mutation becomes unnecessary: the location of improving moves can be calculated deterministically, on average in O(1) time. Seemingly impossible problems, such as the Needle-in-a-Haystack, become trivial to solve in quadratic form. One can also provably tunnel, or jump, between local optima and quasi-local optima in O(n) time using deterministic genetic recombination. The talk also explores how removing randomness from Evolutionary Algorithms might provide new insights into natural evolution. Finally, a form of Evolutionary Algorithm is proposed in which premature convergence is impossible and the evolutionary potential of the population remains open-ended.
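As a concrete illustration (a minimal sketch, not taken from the talk itself), the Python snippet below applies Rosenberg's reduction, one standard quadratization technique: a cubic term is replaced by a product with an auxiliary bit z, and a quadratic penalty forces z to equal the product it stands for. The toy objective and the penalty weight M are assumptions chosen purely for illustration.

    # Sketch: Rosenberg's reduction, one standard way to quadratize a
    # pseudo-Boolean function. The cubic term x1*x2*x3 is replaced by z*x3,
    # where the auxiliary bit z is forced to equal x1*x2 by a penalty that is
    # quadratic and zero exactly when z == x1*x2 (for minimization).
    from itertools import product

    M = 10  # penalty weight; any M exceeding the gain from violating z == x1*x2 works

    def cubic(x1, x2, x3):
        # original higher-order (cubic) objective to minimize
        return -x1 * x2 * x3

    def quadratized(x1, x2, x3, z):
        # quadratic surrogate: substitute z for x1*x2 and add the Rosenberg penalty
        penalty = x1 * x2 - 2 * x1 * z - 2 * x2 * z + 3 * z
        return -z * x3 + M * penalty

    # Check: minimizing the quadratic form over (x, z) recovers the original minimum over x.
    best_cubic = min(cubic(*x) for x in product((0, 1), repeat=3))
    best_quad = min(quadratized(*x, z) for x in product((0, 1), repeat=3) for z in (0, 1))
    assert best_cubic == best_quad == -1
    print("cubic minimum:", best_cubic, " quadratized minimum:", best_quad)

The penalty term is zero exactly when z = x1*x2 and at least 1 otherwise, so for a sufficiently large M the quadratic surrogate shares its minima with the original cubic function while containing only pairwise interactions.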