Augmented reality allows users to visualize annotations, videos, and images overlaid on physical objects as seen through a camera. However, the high computational cost of matching a camera image against an enormous database of images makes augmented reality daunting to deploy on a smartphone. Because image matching is time-consuming, some researchers leverage the Global Positioning System (GPS) to localize outdoor objects: tagging images with GPS locations reduces the number of candidates that must be compared to find a match, which improves overall efficiency. Unfortunately, this approach is not suitable indoors, where GPS signals are unavailable. To address this problem, we propose a system for mobile augmented reality (MAR) in indoor environments. By leveraging the existing Wi-Fi infrastructure, we estimate a user's location inside a building to narrow down the search space. Furthermore, we utilize smartphone motion sensors, such as the accelerometer and magnetometer, to detect the direction in which the phone is pointing and to capture its inclination angle, further reducing the search domain for an object. We deployed the system in a building at Florida State University and found that it decreases matching time significantly: by refining the search domain of the annotated-image database, MAR applies the object recognition algorithm more efficiently, reducing matching time from 2.8 s to just 17 ms over a database of 200 annotated images.
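The search-space pruning described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the record fields, the `narrow_candidates` helper, and the distance and field-of-view thresholds are all assumptions made for the example. The tilt-compensated heading computation follows the standard accelerometer-plus-magnetometer compass formula.

```python
import math

def azimuth_from_sensors(accel, mag):
    """Tilt-compensated compass heading (radians) from accelerometer
    and magnetometer readings, each an (x, y, z) tuple.
    Standard formula; axis conventions are assumed for illustration."""
    ax, ay, az = accel
    # Pitch and roll recovered from the gravity vector.
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    mx, my, mz = mag
    # Rotate the magnetic-field vector into the horizontal plane.
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch) + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(yh, xh) % (2 * math.pi)

def narrow_candidates(images, user_pos, heading,
                      radius_m=10.0, fov_rad=math.pi / 4):
    """Keep only annotated images near the Wi-Fi position estimate and
    within the camera's angular field of view (thresholds are
    illustrative, not the paper's parameters)."""
    keep = []
    for img in images:
        dx = img["x"] - user_pos[0]
        dy = img["y"] - user_pos[1]
        # Drop images outside the Wi-Fi localization radius.
        if math.hypot(dx, dy) > radius_m:
            continue
        # Drop images outside the direction the phone is pointing.
        bearing = math.atan2(dy, dx) % (2 * math.pi)
        diff = abs((bearing - heading + math.pi) % (2 * math.pi) - math.pi)
        if diff <= fov_rad / 2:
            keep.append(img)
    return keep
```

Only the surviving candidates are then passed to the (much more expensive) object recognition step, which is what shrinks the matching time from seconds to milliseconds as the database grows.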