An effective system for evaluating field reliability depends as much on the analytical tools it employs as on its ability to zoom in on relevant subsets of the population. Digital projectors are designed to operate across a wide range of applications and user profiles; as a result, the LED projector population is highly non-homogeneous. The ability to identify differences in performance between user profiles, customers, and manufacturing vintages provides valuable insight, greatly enhancing the investigative process and enabling a customized approach to customer care. However, when analysis is constrained by left-censored data, developing analytical methods suited to truncation is crucial for generating meaningful reliability metrics and ensuring that no available data is left unexplored.

The inadequacy of a single metric to characterize the health of a non-homogeneous population is underscored by the evolving analytical needs of both internal and external stakeholders. Customers are increasingly tech-savvy and discerning in reliability matters. To understand and predict cost of ownership, many customers inquire about repair trends extending beyond warranty and request periodic updates on field performance and reliability growth. This underlines the need for reliability metrics to move beyond MTBF point estimates, which assume a constant failure rate [1] and homogeneous behavior while providing no insight into age-based reliability, no means of quantifying improvement, and no model for predicting future performance.

This paper describes the analytical methods employed in an interactive dashboard implemented to monitor the field reliability of a fleet of over 900 LED projectors. Special consideration is given to the approach used to address left truncation of the repair data. The dashboard presents a powerful combination of parametric and non-parametric tools for repairable systems.
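The shortcoming of the MTBF point estimate noted above can be made concrete with a small sketch. The fleets and failure ages below are invented for illustration only: two fleets observed over the same window, with the same failure count, yield identical MTBF values even though one is improving and the other is degrading.

```python
# Hypothetical illustration (all data invented): two fleets with identical
# MTBF point estimates but opposite reliability trends.

def mtbf(total_hours, n_failures):
    """Classic MTBF point estimate; implicitly assumes a constant failure rate."""
    return total_hours / n_failures

# Failure ages (hours) over a 10,000-hour observation window per fleet.
improving = [500, 1200, 2100, 3400, 5200, 7800]   # inter-failure times growing
worsening = [2200, 4600, 6500, 8000, 9100, 9800]  # failures bunching up late

T = 10_000
m1 = mtbf(T, len(improving))
m2 = mtbf(T, len(worsening))
print(m1 == m2)  # the point estimate cannot tell the fleets apart

# A simple trend check the point estimate hides: failures in each half-window.
def first_half_count(ages):
    return sum(1 for t in ages if t < T / 2)

print(first_half_count(improving), first_half_count(worsening))
```

Age-based tools such as the MCF, discussed next, recover exactly the trend information that the single number discards.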
The Mean Cumulative Function (MCF) is used to gain insight into age-dependent reliability performance, the Crow-AMSAA reliability growth model is used to quantify reliability improvement, and the Annualized Failure Rate (AFR) is calculated for each year in service, providing a powerful metric that complements the graphical MCF and Crow-AMSAA models. In the case study presented, the field repair data is left-censored owing to missing repair records in the seventeen months following product launch. Filters are used to slice the data into more homogeneous subsets by year of deployment [2], and truncation is addressed in a composite approach, starting at the drill-down level, using linear regression of the MCF.

The introduction of the interactive reliability dashboard and its integration into the problem-solving process proved very successful, owing to its capacity to prompt meaningful inquiry and provide actionable insight. Sharing these graphical tools with customers helped build trust, as transparency was greatly enhanced. The narrative of the combined analytical tools, and their potential to add clarity to decision-making, led to renewed interest in the reliability discipline. It also raised expectations of learning to harness the predictive power of the dashboard's analytics to inform decisions on extended warranty provision, forecasting, and planning.
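The three tools named above can be sketched in a few lines each. The fleet below is invented and far smaller than the 900-unit population in the case study, the truncation age is hypothetical, and the linear-regression extrapolation is only one plausible reading of the composite truncation approach; the paper's exact procedure may differ in detail.

```python
import math

# --- Non-parametric MCF for a repairable fleet (all data invented) ---
# Per-unit data: ages (hours) at each repair, plus the unit's current age.
units = [
    {"repairs": [800, 2500],  "age": 4000},
    {"repairs": [1500],       "age": 3500},
    {"repairs": [],           "age": 2000},
    {"repairs": [2200, 2900], "age": 5000},
]

def mcf(units):
    """Nelson-style MCF estimate: at each repair age t, add d / n(t),
    where n(t) is the number of units still under observation at age t."""
    events = sorted(t for u in units for t in u["repairs"])
    pts, total = [], 0.0
    for t in events:
        n_at_risk = sum(1 for u in units if u["age"] >= t)
        total += 1.0 / n_at_risk
        pts.append((t, total))
    return pts

def linear_fit(pts):
    """Ordinary least squares y = a + b*t through the MCF points."""
    n = len(pts)
    sx = sum(t for t, _ in pts); sy = sum(y for _, y in pts)
    sxx = sum(t * t for t, _ in pts); sxy = sum(t * y for t, y in pts)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return (sy - b * sx) / n, b

pts = mcf(units)
a, b = linear_fit(pts)
trunc_age = 1000  # hypothetical start of reliable record-keeping
# The fitted line's value at the truncation age is one crude estimate of the
# unrecorded repairs per unit lost to left truncation.
print(f"estimated MCF at truncation age: {a + b * trunc_age:.3f}")

# --- Crow-AMSAA (NHPP power-law) MLE for a time-terminated single system ---
def crow_amsaa(failure_ages, T):
    """E[N(t)] = lam * t**beta; beta < 1 suggests improving reliability."""
    n = len(failure_ages)
    beta = n / sum(math.log(T / t) for t in failure_ages)
    return n / T ** beta, beta

lam, beta = crow_amsaa([200, 700, 1800, 3100], 4000)  # invented ages
print(f"Crow-AMSAA beta = {beta:.2f} (< 1 suggests improving reliability)")

# --- AFR: failures per accumulated unit-year of service in a given year ---
def afr(failures_in_year, unit_years):
    return failures_in_year / unit_years
```

The MCF of a stable repair process is roughly linear in age, which is what makes the regression-based extrapolation to the truncated window plausible in the first place; a strongly curved MCF would call for the Crow-AMSAA model instead.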