Data quality in IoT and smart manufacturing environments is essential for optimizing workflows, enabling predictive maintenance, and supporting informed decision-making. However, sensor data pose significant challenges due to their real-time nature, diversity of formats, and high susceptibility to faults such as missing values and inconsistencies. Ensuring high-quality data in these environments is therefore crucial to maintaining operational efficiency and process reliability. This paper analyzes data quality metrics from the literature, focusing on their adaptation to the context of Industry 4.0. First, three classification models for data quality dimensions, proposed by different authors, are presented; these group dimensions such as accuracy, completeness, consistency, and timeliness under different approaches. Next, a systematic methodology is adopted to evaluate the metrics associated with these dimensions, using a real-time monitoring scenario throughout. The approach combines dynamic thresholds with historical data to assess the quality of incoming data streams and provide relevant insights. The resulting analysis not only enables continuous monitoring of data quality but also supports informed decision-making, helping to improve operational efficiency in Industry 4.0 environments. Finally, the paper presents a table summarizing the selected metrics, highlighting their advantages, disadvantages, and potential usage scenarios, thereby providing a practical basis for implementation in real environments.
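The dynamic-threshold idea described above can be sketched in a few lines. This is only an illustration, not the paper's actual metric definitions: the threshold form (mean ± k·σ over a historical window), the function name `assess_stream`, and the fault labels are assumptions introduced here for clarity.

```python
from statistics import mean, stdev

def assess_stream(history, incoming, k=3.0):
    """Label each incoming sensor reading as 'ok', 'missing', or
    'out_of_range' using a dynamic threshold derived from history.

    NOTE: mean +/- k * std is an assumed, illustrative threshold rule;
    the paper's own metrics may differ.
    """
    mu = mean(history)
    sigma = stdev(history)
    low, high = mu - k * sigma, mu + k * sigma
    labels = []
    for x in incoming:
        if x is None:                 # completeness fault: missing value
            labels.append("missing")
        elif not (low <= x <= high):  # accuracy fault: outside dynamic band
            labels.append("out_of_range")
        else:
            labels.append("ok")
    return labels
```

In a streaming deployment, `history` would typically be a sliding window so the threshold adapts as the process drifts; here a fixed list keeps the sketch self-contained.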