A split analog-to-digital converter (ADC) digital background calibration with full-input-range error detection schemes is proposed to rapidly correct the gain and nonlinearity errors in the multi-bit first stage of a multi-channel time-interleaved (TI) pipelined ADC. By adding a vertical shift between the residue transfer curves of the first stages of the two half-ADCs split from a single ADC, the error detection schemes of the proposed calibration remain effective over the full input range. The wider error detection range means that the calibration is activated more often, so fewer ADC conversions are needed to converge. In addition, the designed fast-settling switch controller enables a 12-bit resistor-ladder DAC (R-DAC) to operate in high-speed applications. Furthermore, by applying the proposed calibration and sharing the R-DAC among all channels, the need for gain-mismatch calibration between interleaved channels is eliminated, which further reduces calibration time and complexity. A 12-bit 400-MS/s 4-channel TI pipelined ADC prototype is implemented in 40-nm CMOS technology with an active area of 0.71 mm$^2$; its measured SNDR and INL are improved by up to 23 dB and 96 LSB, respectively, by the proposed calibration. Compared with prior-art ADCs using background calibration, the proposed ADC achieves the fastest background calibration, converging within 4,000 conversions/channel, at least $5\times$ fewer than the others.
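To illustrate the split-ADC background calibration principle described above, the following is a minimal behavioral sketch, not the paper's implementation. It models a 2-bit first stage with an assumed interstage gain error, uses a half-LSB shift of the sub-ADC thresholds of one half-ADC as a stand-in for the paper's vertical shift between residue transfer curves, and runs an LMS loop on the difference between the two half-ADC outputs. All names and constants (`first_stage`, `G_ACTUAL`, `MU`, the 4,000-sample loop length) are hypothetical choices for this toy model; the backend is assumed ideal and residue-range limits are ignored.

```python
import numpy as np

rng = np.random.default_rng(0)

VREF = 1.0
G_IDEAL = 4.0            # ideal interstage gain of a 2-bit first stage
G_ACTUAL = 3.7           # assumed gain error to be calibrated out (illustrative)
MU = 0.05                # LMS step size (illustrative)

# 2-bit sub-ADC thresholds; half B is shifted by half an LSB so the two
# half-ADCs follow different residue trajectories for part of the input range
# (a stand-in for the paper's vertical shift of the residue transfer curves).
LSB = VREF / 4
TH_A = np.array([-VREF / 2, 0.0, VREF / 2])
TH_B = TH_A + LSB / 2
DAC_LEVELS = np.array([-3, -1, 1, 3]) * VREF / 4   # sub-DAC output per code

def first_stage(vin, thresholds):
    """Coarse decision, sub-DAC voltage, and analog residue of one half-ADC."""
    code = np.searchsorted(thresholds, vin)
    vdac = DAC_LEVELS[code]
    vres = G_ACTUAL * (vin - vdac)          # residue amplified by the real gain
    return vdac, vres

w = 1.0 / G_IDEAL   # digital correction weight; converges toward 1/G_ACTUAL

for n in range(4000):
    vin = rng.uniform(-VREF, VREF)          # normal conversions, no test signal
    vdac_a, vres_a = first_stage(vin, TH_A)
    vdac_b, vres_b = first_stage(vin, TH_B)

    # Digital reconstruction of each half-ADC (backend assumed ideal here).
    dout_a = vdac_a + w * vres_a
    dout_b = vdac_b + w * vres_b

    # Split-ADC principle: both halves digitize the same input, so any
    # difference between their outputs is an error signature; the shifted
    # thresholds keep that difference informative without a test signal.
    diff = dout_a - dout_b
    w -= MU * diff * (vres_a - vres_b)      # LMS step driving diff -> 0

print(f"estimated 1/gain = {w:.4f}, ideal = {1 / G_ACTUAL:.4f}")
```

In this toy model the update is only informative when the two halves make different coarse decisions; the paper's vertical residue shift serves the same purpose while, as stated in the abstract, keeping error detection active over the full input range, which is what shortens the convergence to a few thousand conversions.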