With the arrival of the 6G era, computational demands are increasing rapidly, and multi-access edge computing (MEC) has emerged as an effective mechanism to satisfy them. In disaster-affected or remote areas where ground base stations may struggle to provide service, unmanned aerial vehicles (UAVs) and high-altitude platforms (HAPs) can leverage their flexible deployment capabilities to offer MEC services. In this article, we design a layered aerial computing framework comprising user equipments (UEs), a UAV, and a HAP. Building upon this framework, we establish a hierarchical offloading computation model and formulate an optimization problem to jointly optimize resource utilization and task scheduling within the model. Specifically, the objective is to minimize task computation delay and maximize the remaining energy of the UAV, i.e., minimize its energy consumption, subject to constraints on the total task quantity, delay requirements, and UAV energy limitations. Because this problem is nonconvex and highly complex, we propose a deep reinforcement learning-based trajectory optimization and task offloading (DTOTO) algorithm that enables the agent (the UAV) to make correct decisions in complex environments and high-dimensional action spaces. The algorithm jointly optimizes the UAV's trajectory and the corresponding offloading decisions. Additionally, we employ state normalization to improve training efficiency. Simulation results validate the effectiveness of the computing framework and the DTOTO algorithm, and numerical results are analyzed to evaluate system performance.
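The abstract mentions state normalization as a training-efficiency technique for the DRL agent. As an illustrative sketch only, one common realization is a running mean/variance normalizer applied to observations before they are fed to the policy network; the class name, update rule (Welford-style), and all parameters below are assumptions, not details taken from the paper.

```python
import numpy as np

class StateNormalizer:
    """Running mean/std normalizer for DRL observations.

    Hypothetical sketch: the paper states that state normalization
    is used, but the exact scheme shown here is an assumption.
    """

    def __init__(self, dim, eps=1e-8):
        self.mean = np.zeros(dim)   # running mean of each state feature
        self.var = np.ones(dim)     # running variance of each feature
        self.count = eps            # sample count (eps avoids div-by-zero)
        self.eps = eps

    def update(self, x):
        # Welford-style incremental update of mean and variance.
        x = np.asarray(x, dtype=float)
        self.count += 1.0
        delta = x - self.mean
        self.mean += delta / self.count
        self.var += (delta * (x - self.mean) - self.var) / self.count

    def normalize(self, x):
        # Scale a raw state vector to roughly zero mean, unit variance.
        x = np.asarray(x, dtype=float)
        return (x - self.mean) / np.sqrt(self.var + self.eps)
```

In practice such a normalizer is updated with every observed state during training, so that heterogeneous features (e.g., UAV position in meters versus residual energy in joules) reach the network on comparable scales.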