Monitoring physical activity is crucial for assessing patient health, particularly in managing chronic diseases and rehabilitation. Wearable devices that track physical movement play a key role in monitoring elderly individuals and patients with chronic diseases. However, sharing this data is often restricted by privacy regulations such as GDPR, as well as by data-ownership and security concerns, limiting its use in collaborative healthcare analysis. Federated analytics (FA) offers a promising solution that enables multiple parties to gain insights without sharing raw data, but current research focuses more on data protection than on actionable insights, and little work has explored analyzing privacy-preserved, aggregated data to uncover patterns for patient monitoring and healthcare interventions. This paper addresses this gap by proposing FAItH, a dual-stage solution that integrates privacy-preserving mechanisms (Laplace, Gaussian, Exponential, and Locally Differentially Private (LDP) noise) into statistical functions (mean, variance, quantile) within a federated analytics environment. The solution employs feature-specific scaling to fine-tune the privacy-utility trade-off, ensuring that sensitive features are strongly protected while less sensitive ones retain utility. After applying FA with differential privacy (DP) to generate insights, we introduce clustering to identify patterns in patient activity relevant to healthcare. On the Human Activity Recognition (HAR) dataset, FAItH shows that privacy-preserving configurations achieve clustering utility nearly equal to that of non-DP setups and outperform existing privacy-preserving clustering algorithms, balancing privacy with effective insights. These results validate FA with DP as a viable approach for secure collaborative analysis in healthcare, enabling meaningful insights without compromising patient privacy.
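To make the core idea concrete, the following is a minimal sketch of one of the mechanisms the abstract names: a Laplace-noised mean with feature-specific privacy budgets. This is an illustrative assumption, not the paper's implementation; the feature names, clipping bounds, and epsilon values are hypothetical.

```python
import numpy as np

def dp_mean(values, lower, upper, epsilon, rng=None):
    """Differentially private mean via the Laplace mechanism.

    Clamps each value to [lower, upper] so the sensitivity of the
    mean is bounded by (upper - lower) / n, then adds Laplace noise
    with scale sensitivity / epsilon.
    """
    rng = rng or np.random.default_rng()
    x = np.clip(np.asarray(values, dtype=float), lower, upper)
    sensitivity = (upper - lower) / len(x)
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return x.mean() + noise

# Feature-specific scaling (hypothetical budgets): spend a larger
# epsilon (less noise) on less sensitive features so their statistics
# stay useful, and a smaller epsilon (more noise) on sensitive ones.
heart_rate = [72, 80, 65, 90, 78]          # sensitive feature
step_count = [3400, 5200, 1200, 8000, 4100]  # less sensitive feature
hr_mean = dp_mean(heart_rate, lower=40, upper=200, epsilon=0.5)
steps_mean = dp_mean(step_count, lower=0, upper=20000, epsilon=2.0)
```

In an FA setting, each party would release only such noised aggregates; the server then combines them (and, in FAItH's second stage, clusters the resulting feature vectors) without ever seeing raw records.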