This paper proposes an Expectation-Maximization (EM) training algorithm to estimate the parameters of a special Probabilistic Neural Network (PNN) structure that forms a multi-category Bayes classifier. The PNN is structured as a four-layer Feedforward Neural Network (FNN) in which each internal node implements a Parzen Gaussian probability density function. In this way, the EM algorithm is extended to supervised, multi-category training of the neural-network Gaussian classifier. The computational efficiency and numerical stability of the training algorithm benefit from the well-established EM framework. The effectiveness of the proposed network architecture and its EM training algorithm is assessed in an experiment.
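To make the idea concrete, the following is a minimal sketch, not the paper's implementation, of EM training for a class-conditional Gaussian-mixture classifier in the spirit of the PNN described above: one mixture of spherical Gaussian kernels per class, combined through Bayes' rule. All function and variable names (em_gaussian_mixture, fit_bayes_classifier, the component count, iteration count) are illustrative assumptions, not quantities taken from the paper.

```python
import numpy as np

def em_gaussian_mixture(X, n_components, n_iter=50, seed=0):
    """Fit a spherical Gaussian mixture to X with EM (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Initialize means from random samples, shared variance, uniform weights.
    means = X[rng.choice(n, n_components, replace=False)]
    variances = np.full(n_components, X.var() + 1e-6)
    weights = np.full(n_components, 1.0 / n_components)
    for _ in range(n_iter):
        # E-step: responsibility of each Gaussian component for each sample.
        sq_dist = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
        log_prob = (-0.5 * sq_dist / variances
                    - 0.5 * d * np.log(2 * np.pi * variances)
                    + np.log(weights))
        log_prob -= log_prob.max(axis=1, keepdims=True)
        resp = np.exp(log_prob)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and spherical variances.
        nk = resp.sum(axis=0) + 1e-10
        weights = nk / n
        means = (resp.T @ X) / nk[:, None]
        sq_dist = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
        variances = (resp * sq_dist).sum(axis=0) / (d * nk) + 1e-6
    return weights, means, variances

def log_likelihood(X, weights, means, variances):
    """Log p(x | class) under a fitted mixture, via log-sum-exp."""
    d = X.shape[1]
    sq_dist = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)
    log_prob = (-0.5 * sq_dist / variances
                - 0.5 * d * np.log(2 * np.pi * variances)
                + np.log(weights))
    m = log_prob.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(log_prob - m).sum(axis=1, keepdims=True))).ravel()

def fit_bayes_classifier(X, y, n_components=3):
    """Fit one mixture per class; class priors from class frequencies."""
    classes = np.unique(y)
    models, priors = {}, {}
    for c in classes:
        Xc = X[y == c]
        models[c] = em_gaussian_mixture(Xc, min(n_components, len(Xc)))
        priors[c] = len(Xc) / len(X)
    return classes, models, priors

def predict(X, classes, models, priors):
    """Bayes rule: argmax over classes of log prior + log class-conditional."""
    scores = np.column_stack([
        np.log(priors[c]) + log_likelihood(X, *models[c]) for c in classes])
    return classes[scores.argmax(axis=1)]
```

The spherical covariances mirror the single smoothing parameter of a Parzen Gaussian kernel, and each EM iteration has closed-form E- and M-steps, which is the source of the computational efficiency and numerical stability the abstract attributes to the EM framework.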