One-bit analog-to-digital converters (ADCs) have attracted significant research interest in massive multiple-input multiple-output (MIMO) systems because they greatly reduce hardware cost and power consumption. In this article, we investigate the performance of matched filtering (MF) in massive MIMO radar equipped with one-bit ADCs. First, we show that in the presence of temporally uncorrelated Gaussian noise, the MF output of the one-bit quantized received signals is asymptotically Gaussian. We then derive the statistical characteristics of the MF output, namely its mean and covariance matrix, under two noise scenarios: 1) white Gaussian noise and 2) spatially correlated Gaussian noise. These analyses provide valuable insight into the behavior of the MF output. Importantly, exploiting the large number of measurements available in massive MIMO radar (i.e., samples in the space, frequency, or time domains), we present an approximate probability distribution for the MF output, which enables the development of low-complexity signal processing algorithms for one-bit MIMO radar. Based on these approximations, we further derive the performance gap between one-bit and conventional high-resolution MIMO radars. Finally, we present representative simulations of target detection and beamforming to illustrate the performance of massive MIMO radar employing one-bit ADCs.
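The asymptotic-normality claim is easy to probe numerically. The following is a minimal Monte Carlo sketch, not the paper's actual setup: the waveform, SNR, and the sizes `N` and `K` are illustrative assumptions. It one-bit quantizes the in-phase and quadrature parts of a noisy received waveform, applies matched filtering, and checks that the empirical skewness and excess kurtosis of the MF output are near zero, as a Gaussian limit would suggest for large `N`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not from the paper):
N, K = 4096, 10000            # samples per trial, number of Monte Carlo trials
snr_db = -10.0                # low per-sample SNR, recovered by the MF gain
a = 10 ** (snr_db / 20)       # signal amplitude

# Unit-modulus probing waveform (an arbitrary choice for illustration).
s = np.exp(1j * 2 * np.pi * 0.1 * np.arange(N))

# Received signal: scaled waveform in white Gaussian noise, then one-bit
# quantization applied separately to the real and imaginary parts.
noise = (rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))) / np.sqrt(2)
y = a * s + noise
y_1bit = np.sign(y.real) + 1j * np.sign(y.imag)

# Matched-filter output per trial: correlate the one-bit data with the waveform.
mf = (y_1bit @ s.conj()) / np.sqrt(N)

# With N large, each MF output is a normalized sum of N weakly dependent terms,
# so its real part should be close to Gaussian across the K trials.
z = (mf.real - mf.real.mean()) / mf.real.std()
print("skewness       :", np.mean(z**3))   # ~0 for a Gaussian
print("excess kurtosis:", np.mean(z**4) - 3)  # ~0 for a Gaussian
```

Under these assumptions, both statistics come out close to zero, which is consistent with the central-limit-theorem argument sketched in the abstract; the paper's contribution is to characterize the limiting mean and covariance analytically for white and spatially correlated noise.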