With this article, we present the first, to our knowledge, development system for creating Augmented Virtuality (AV) applications with dynamic and tangible interfaces. In addition, we propose a new taxonomy for differentiating between various types of AV applications.

The System: The system presented in this article is developed for the Meta Quest 2 VR headset and automatically handles the synchronization between virtual objects and their physical counterparts using the controller's position. To ensure persistent real-time alignment and synchronization during physical manipulation, it uses sensor feedback transmitted wirelessly from microcontrollers over UDP, as sketched in the example at the end of this section. In addition, the user interface implemented for Unity enables users to use the system without writing code. Furthermore, we have released the system as an open-source Unity asset package and invite others to use it for AV research, educational, industry, or gaming purposes: https://github.com/BP-GITT/Tangible-VR/releases/tag/v1.0.0.

Taxonomy: The article also provides an overview of the identified types, approaches, and methods for achieving Augmented Virtuality and proposes a new practice-oriented taxonomy for differentiating between them. The overview and taxonomy are a suitable starting point for AV application developers and new researchers and include the following six dimensions: 1) type of AV, 2) approach to alignment, 3) synchronization method and sub-method, 4) number of synchronizable objects, 5) the flexibility of their positioning, and 6) their state.
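To illustrate the kind of wireless sensor link described above, the following is a minimal sketch of a Unity (C#) component that receives sensor packets from a microcontroller over UDP and applies the latest reading to a virtual object. The class name, port number, and packet format (a single ASCII-encoded float per datagram) are illustrative assumptions and do not reflect the released package's API.

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;
using UnityEngine;

// Hypothetical example: listens for UDP datagrams carrying one sensor value
// (e.g. a hinge angle read by a microcontroller) and rotates the attached
// virtual object accordingly each frame.
public class UdpSensorReceiver : MonoBehaviour
{
    [SerializeField] private int listenPort = 9000; // assumed port, not from the package

    private UdpClient client;
    private Thread receiveThread;
    private volatile bool running = true;
    private volatile float latestValue;             // most recent sensor reading

    private void Start()
    {
        client = new UdpClient(listenPort);
        receiveThread = new Thread(ReceiveLoop) { IsBackground = true };
        receiveThread.Start();
    }

    private void ReceiveLoop()
    {
        var remote = new IPEndPoint(IPAddress.Any, 0);
        while (running)
        {
            try
            {
                byte[] data = client.Receive(ref remote);      // blocks until a packet arrives
                string text = Encoding.ASCII.GetString(data);  // assumed payload, e.g. "42.5"
                if (float.TryParse(text, out float value))
                    latestValue = value;                       // consumed on the main thread in Update()
            }
            catch (SocketException)
            {
                break;                                         // socket closed during shutdown
            }
        }
    }

    private void Update()
    {
        // Drive the virtual counterpart from the physical sensor reading.
        transform.localRotation = Quaternion.Euler(0f, latestValue, 0f);
    }

    private void OnDestroy()
    {
        running = false;
        client?.Close();                                       // unblocks the background Receive call
    }
}
```

This snippet only shows the UDP transport pattern; in the released system, such readings are handled by the package's own synchronization components rather than applied directly to a transform.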