Least Squares Support Vector Regression (LSSVR), the least squares variant of Support Vector Regression (SVR), is defined by a regularized squared loss without epsilon-insensitivity. LSSVR is formulated in the dual space as an equality-constrained quadratic minimization problem, which reduces to solving a system of linear algebraic equations. Because this system involves half as many Lagrange multipliers as classical SVR, LSSVR requires far less computation time than the classical SVR. Despite this computationally attractive feature, LSSVR lacks the sparsity that SVR owes to its epsilon-insensitive loss: every training input is treated as a support vector, yielding extremely poor generalization performance. To overcome these drawbacks, this paper derives an epsilon-insensitive LSSVR that applies epsilon-insensitivity to the quadratic loss, so that sparsity is controlled directly by the epsilon parameter. Since the quadratic loss is sensitive to outliers, a weighted version (epsilon-insensitive WLSSVR) has also been developed. Finally, the performances of epsilon-insensitive LSSVR and epsilon-insensitive WLSSVR are quantitatively compared in detail with two approaches commonly used in the literature, pruning-based LSSVR and weighted pruning-based LSSVR. Experimental results on simulated data and 8 different real-life datasets show that epsilon-insensitive LSSVR and epsilon-insensitive WLSSVR are superior in terms of computation time, generalization ability, and sparsity.
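
As a concrete illustration of the linear system mentioned above, the following is a minimal NumPy sketch of the standard (Suykens-style) LSSVR training step with an RBF kernel. It shows only the baseline LSSVR, not the epsilon-insensitive variants derived in this paper; the function names and the gamma (regularization) and sigma (kernel width) parameters are illustrative choices, not notation taken from the text.

import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian RBF kernel matrix: K[i, j] = exp(-||a_i - b_j||^2 / (2 sigma^2))."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))

def lssvr_fit(X, y, gamma=10.0, sigma=1.0):
    """Solve the LSSVR dual: one (n+1) x (n+1) linear system instead of a QP.

        [ 0   1^T         ] [b]     [0]
        [ 1   K + I/gamma ] [alpha] [y]
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0                     # top row: equality constraint on alphas
    A[1:, 0] = 1.0                     # first column: bias term
    A[1:, 1:] = K + np.eye(n) / gamma  # regularized kernel block
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]             # alpha (n,), bias b

def lssvr_predict(X_train, alpha, b, X_new, sigma=1.0):
    """f(x) = sum_i alpha_i K(x, x_i) + b."""
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

Because the solved alpha vector is generically dense, every training point contributes to the predictor; this is precisely the sparsity deficiency that the epsilon-insensitive LSSVR and WLSSVR formulations of this paper are designed to remove.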