The hidden Markov model (HMM) provides a natural framework for modeling the dynamic evolution of latent diseases. The unknown probability matrices of an HMM can be learned through the well-known Baum-Welch algorithm, a special case of the expectation-maximization algorithm. In many disease models, the probability matrices possess nontrivial properties that can be represented by a set of linear constraints. In these cases, the traditional Baum-Welch algorithm is no longer applicable because the maximization step cannot be solved by an explicit formula. In this paper, we propose a novel approach that efficiently solves the maximization-step problem under linear constraints through a Lagrangian dual reformulation, which we solve with an accelerated gradient method. The performance of this approach depends critically on a fast method for computing the gradient at each iteration. For this purpose, we employ dual decomposition and derive Karush-Kuhn-Tucker conditions that reduce the problem to a set of single-variable equations, each solved by a simple bisection method. We apply the method to a case study on sports-related concussion and conduct an extensive simulation-based numerical study. We show that our approach is orders of magnitude faster and more accurate than alternative approaches, and that it is far less sensitive to increases in problem size. Overall, our contribution lies in accurately and efficiently handling HMM parameter estimation under linear constraints, a setting that arises in a wide range of applications in disease modeling and beyond.
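
To illustrate the structure of the constrained maximization step, the following is a minimal sketch assuming the standard Baum-Welch M-step objective, a weighted log-likelihood with expected counts $c_i$ obtained from the E-step; the constraint matrix $A$, vector $b$, and multipliers $\lambda$, $\mu$ are illustrative placeholders rather than the paper's exact formulation. For one row $\theta$ of a probability matrix, the constrained M-step takes the form
$$\max_{\theta \ge 0} \; \sum_i c_i \log \theta_i \quad \text{subject to} \quad \sum_i \theta_i = 1, \quad A\theta \le b.$$
Without the constraint $A\theta \le b$, the maximizer is the familiar closed-form update $\theta_i = c_i / \sum_j c_j$; with it, no explicit formula exists, and one can instead minimize the Lagrangian dual
$$g(\lambda) = \max_{\theta \in \Delta} \Big\{ \sum_i c_i \log \theta_i - \lambda^\top (A\theta - b) \Big\}, \qquad \lambda \ge 0,$$
by an accelerated gradient method, where (by Danskin's theorem) $\nabla g(\lambda) = b - A\theta^*(\lambda)$. The inner maximization separates across rows of the probability matrices, which is one way the dual decomposition described above can enter. Evaluating $\theta^*(\lambda)$ then reduces to a scalar problem: the Karush-Kuhn-Tucker stationarity condition $c_i/\theta_i = \mu + (A^\top \lambda)_i$ gives $\theta_i(\mu) = c_i / \big(\mu + (A^\top \lambda)_i\big)$, and the normalization $\sum_i \theta_i(\mu) = 1$ is strictly monotone in the single variable $\mu$, so a simple bisection recovers it.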