For classification problems, the traditional least squares twin support vector machine (LSTSVM) generates two nonparallel hyperplanes directly by solving two systems of linear equations instead of a pair of quadratic programming problems (QPPs), which makes LSTSVM much faster than the original twin support vector machine (TSVM). However, the standard LSTSVM, which adopts a quadratic loss measured by the minimal distance, is sensitive to noise and unstable under re-sampling. To overcome this problem, the expectile distance is introduced to measure the margin between classes, and an LSTSVM with asymmetric squared loss (aLSTSVM) is proposed. Compared with the original LSTSVM based on the quadratic loss, the proposed aLSTSVM not only achieves comparable computational accuracy but also exhibits desirable properties such as noise insensitivity, scatter minimization, and re-sampling stability. Numerical experiments on synthetic datasets, normally distributed clustered (NDC) datasets, and University of California, Irvine (UCI) datasets with different types of noise confirm the effectiveness and validity of our proposed algorithm.
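
As a point of reference, the sketch below illustrates the linear-system solution of the standard LSTSVM that the abstract contrasts with QPP-based TSVM: each of the two nonparallel hyperplanes is obtained from one closed-form linear solve, and test points are assigned to the class of the nearer hyperplane. The parameter names c1, c2 and the small ridge term reg are illustrative assumptions, and the sketch uses the ordinary (symmetric) squared loss rather than the asymmetric squared (expectile) loss introduced in the paper.

```python
import numpy as np

def fit_lstsvm(A, B, c1=1.0, c2=1.0, reg=1e-6):
    """Standard LSTSVM sketch: two nonparallel hyperplanes from two
    systems of linear equations (no quadratic programming).

    A : (m1, n) samples of class +1
    B : (m2, n) samples of class -1
    c1, c2 : trade-off parameters (hypothetical names)
    reg : small ridge term added for numerical stability (assumption)
    """
    m1, n = A.shape
    m2 = B.shape[0]
    E = np.hstack([A, np.ones((m1, 1))])   # augmented class +1 matrix [A  e1]
    F = np.hstack([B, np.ones((m2, 1))])   # augmented class -1 matrix [B  e2]
    I = reg * np.eye(n + 1)

    # Hyperplane 1: proximal to class +1, pushed away from class -1
    # z1 = -(F^T F + (1/c1) E^T E)^(-1) F^T e2
    z1 = -np.linalg.solve(F.T @ F + (1.0 / c1) * E.T @ E + I,
                          F.T @ np.ones(m2))
    # Hyperplane 2: proximal to class -1, pushed away from class +1
    # z2 = (E^T E + (1/c2) F^T F)^(-1) E^T e1
    z2 = np.linalg.solve(E.T @ E + (1.0 / c2) * F.T @ F + I,
                         E.T @ np.ones(m1))
    (w1, b1), (w2, b2) = (z1[:-1], z1[-1]), (z2[:-1], z2[-1])
    return w1, b1, w2, b2

def predict_lstsvm(X, w1, b1, w2, b2):
    """Assign each sample to the class of its nearer hyperplane."""
    d1 = np.abs(X @ w1 + b1) / np.linalg.norm(w1)
    d2 = np.abs(X @ w2 + b2) / np.linalg.norm(w2)
    return np.where(d1 <= d2, 1, -1)
```

In the paper's aLSTSVM, the symmetric squared penalty above would be replaced by an asymmetric squared (expectile-type) loss that weights positive and negative residuals differently, which is what gives the method its noise insensitivity and re-sampling stability; the exact parameterization is given in the paper rather than in this sketch.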