Wearable augmented reality (WAR) combines a live view of a real scene with computer-generated graphics on resource-limited platforms. One of the crucial technologies for WAR is real-time 6-DoF pose tracking, which facilitates the registration of virtual components within a real scene. Artificial markers are typically used to provide pose tracking for WAR applications; however, these marker-based methods suffer from marker occlusion and large viewpoint changes. This paper therefore presents a multi-sensor tracking approach that performs real-time 6-DoF pose tracking with real-time scale estimation for WAR on a consumer smartphone. By combining a wide-angle monocular camera with an inertial sensor, more robust 6-DoF motion tracking is achieved through the mutual compensation of the heterogeneous sensors. Moreover, with the help of a depth sensor, the scale initialization of monocular tracking is addressed: the initial scale is propagated through the subsequent sensor-fusion process, alleviating the scale drift of traditional monocular tracking approaches. In addition, a sliding-window Kalman filter framework is used to provide low-jitter pose tracking for WAR. Finally, experiments demonstrate the feasibility and robustness of the proposed tracking method for WAR applications. (C) 2017 Elsevier B.V. All rights reserved.