Code generators are frequently used to compile language-independent specifications into client libraries for multiple programming languages. One example is the message definition specification of the Robot Operating System (ROS). This work discusses how a configurable code generator for reconfigurable hardware, built with a model-based toolchain based on attribute grammars, is tested during development. The generator supports multiple input and output variants for different source and target languages. To ensure the correctness of all code the generator can potentially produce, a modular test toolchain is provided that can be extended to support different client libraries. Using it, we identify bugs in the tool under test caused by divergence from the specification, across all current ROS distributions. Finally, we present insights obtained during the design and execution of the test system.
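As a brief sketch of what such a language-independent specification looks like (an illustrative example of the standard ROS message geometry_msgs/Point, not taken from this work): a ROS .msg file lists typed fields, and the ROS message generators derive client library types for each target language from it.

```python
# Illustrative example: the standard ROS message geometry_msgs/Point is
# specified language-independently in a .msg file as
#
#   # Point.msg
#   float64 x
#   float64 y
#   float64 z
#
# From this, the ROS code generators emit client library types, e.g. in Python:
from geometry_msgs.msg import Point

p = Point(x=1.0, y=2.0, z=3.0)  # field names and types mirror the .msg file
print(p.x, p.y, p.z)
```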