The combination of Unmanned Aerial Vehicles (UAVs) and Mobile Edge Computing (MEC) effectively meets the demands of user equipment (UE) for high-quality computing services, low energy consumption, and low latency. However, in complex environments such as disaster rescue scenarios, a single UAV is still constrained by limited transmission power and computing resources, making it difficult to complete computational tasks efficiently. To address this issue, we propose a UAV swarm-enabled MEC system that integrates data compression, in which a single swarm head UAV (USH) collects the computing tasks compressed by the UEs and distributes part of the workload to swarm member UAVs (USMs) for collaborative processing. To minimize the total energy and time cost of the system, we model the problem as a Markov Decision Process (MDP) and construct a deep deterministic policy gradient offloading algorithm with a prioritized experience replay mechanism (PER-DDPG) to jointly optimize the compression ratio, task offloading rate, resource allocation, and swarm positioning. Simulation results show that, compared with the deep Q-network (DQN) and deep deterministic policy gradient (DDPG) baselines, the proposed scheme achieves better convergence and robustness, reducing system latency and energy consumption by about 32.7%.
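The distinguishing component of PER-DDPG is the replacement of DDPG's uniform replay buffer with prioritized sampling. The sketch below is a minimal, illustrative Python/NumPy implementation of proportional prioritized experience replay (in the style of Schaul et al.); the class name, hyperparameter values, and interface are our assumptions for illustration, not the paper's implementation.

```python
import numpy as np

class PERBuffer:
    """Minimal sketch of proportional prioritized experience replay.

    Transitions are sampled with probability p_i^alpha / sum_j p_j^alpha,
    and importance-sampling weights w_i = (N * P(i))^(-beta) correct the
    bias introduced by non-uniform sampling. All hyperparameter defaults
    here are illustrative assumptions, not the paper's settings.
    """

    def __init__(self, capacity=10000, alpha=0.6, beta=0.4, eps=1e-6):
        self.capacity = capacity
        self.alpha, self.beta, self.eps = alpha, beta, eps
        self.buffer = []                                   # stored transitions
        self.priorities = np.zeros(capacity, dtype=np.float64)
        self.pos = 0                                       # circular write index

    def add(self, transition):
        # New transitions receive the current maximum priority so that
        # each one is sampled at least once before being down-weighted.
        max_p = self.priorities.max() if self.buffer else 1.0
        if len(self.buffer) < self.capacity:
            self.buffer.append(transition)
        else:
            self.buffer[self.pos] = transition
        self.priorities[self.pos] = max_p
        self.pos = (self.pos + 1) % self.capacity

    def sample(self, batch_size):
        n = len(self.buffer)
        probs = self.priorities[:n] ** self.alpha
        probs /= probs.sum()
        idx = np.random.choice(n, batch_size, p=probs)
        # Importance-sampling weights, normalized by their maximum
        # so they only scale the loss downward.
        weights = (n * probs[idx]) ** (-self.beta)
        weights /= weights.max()
        batch = [self.buffer[i] for i in idx]
        return batch, idx, weights

    def update_priorities(self, idx, td_errors):
        # Priority proportional to |TD error|; eps keeps it nonzero.
        self.priorities[idx] = np.abs(td_errors) + self.eps
```

In a PER-DDPG training loop of this kind, the absolute TD errors computed by the critic on each sampled minibatch would be fed back through `update_priorities`, while the returned importance-sampling weights scale the per-sample critic loss, so transitions with large errors are revisited more often without biasing the learned value estimates.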