Compressed sensing (CS) is a well-established signal reconstruction technique based on sub-Nyquist sampling rates, widely used in image reconstruction and ultrawideband (UWB) sensor systems. In image reconstruction, deep unfolding networks (DUNs) have been shown to improve the performance of end-to-end image CS (ICS) owing to their strong interpretability; however, they often incur high computational complexity and long running times. Moreover, ICS based on convolutional neural networks (CNNs) can produce blocking artifacts because of the inherent limitations of convolutional receptive fields. The transformer, which has made notable advances in capturing global features, has therefore received increasing attention. This article proposes the Transformer-CNN Cooperative Network for ICS (dubbed TCC-CSNet), a dual-branch network that connects a transformer and a CNN in parallel with feature fusion, combining their respective strengths in capturing global and local features to enhance image reconstruction quality. Experimental results demonstrate that TCC-CSNet outperforms previous methods in reconstruction quality and noise robustness while reducing computational complexity. In addition, given the wide applicability of CS, we employ the transformer branch of TCC-CSNet to accurately reconstruct UWB radar motion data; the effectiveness of the proposed method is verified on radar signal data acquired by a P440 UWB sensor.
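For context, the sub-Nyquist acquisition and reconstruction problem that underlies both the image and UWB radar applications can be sketched as follows (the notation here is illustrative and not taken from the paper itself):

```latex
% Acquisition: M linear measurements of an N-dimensional signal, with M << N
y = \Phi x, \qquad y \in \mathbb{R}^{M},\ \Phi \in \mathbb{R}^{M \times N},\ x \in \mathbb{R}^{N}

% Reconstruction exploits sparsity of x under a transform \Psi,
% e.g. via an l1-regularized least-squares problem:
\hat{x} = \arg\min_{x} \tfrac{1}{2} \left\| y - \Phi x \right\|_{2}^{2}
          + \lambda \left\| \Psi x \right\|_{1}
```

Networks such as DUNs can be viewed as unrolled iterative solvers of this optimization problem, which is the source of their interpretability mentioned above.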