The naive Bayesian classifier (NBC) is a supervised machine learning algorithm with a simple model structure and good theoretical interpretability. However, the generalization performance of NBC is largely limited by the attribute independence assumption. To address this issue, this paper proposes a novel attribute grouping-based NBC (AG-NBC), a variant of the classical NBC trained on different attribute groups. AG-NBC first applies a novel and effective objective function to automatically identify optimal dependent attribute groups (DAGs). Condition attributes within the same DAG are strongly dependent given the class attribute, whereas attributes in different DAGs are independent of one another. Then, for each DAG, a random vector functional link network with a softmax layer is trained to output posterior probabilities in the form of joint probability density estimates. The NBC is then trained on the grouped attributes in place of the original condition attributes. Extensive experiments were conducted to validate the rationality, feasibility, and effectiveness of AG-NBC. Our findings show that the attribute groups chosen for NBC accurately represent attribute dependencies and reduce overlaps between different posterior probability densities. In addition, comparative results with NBC, flexible NBC (FNBC), tree augmented Bayes network (TAN), gain ratio-based attribute weighted naive Bayes (GRAWNB), averaged one-dependence estimators (AODE), weighted AODE (WAODE), independent component analysis-based NBC (ICA-NBC), the hidden naive Bayesian (HNB) classifier, and correlation-based feature weighting for naive Bayes (CFW) show that AG-NBC obtains statistically better testing accuracies, larger areas under the receiver operating characteristic curve (AUCs), and lower probability mean square errors (PMSEs) than the other Bayesian classifiers. The experimental results demonstrate that AG-NBC is a valid and efficient approach for alleviating the attribute independence assumption when building NBCs.
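To make the grouped factorization concrete, the following is a minimal sketch of the posterior implied by the abstract; the partition notation $G_1,\dots,G_K$ for the DAGs and the estimator symbol $\hat{P}$ are illustrative labels introduced here, not the paper's notation.
\[
\underbrace{P(c \mid \mathbf{x}) \;\propto\; P(c)\prod_{i=1}^{d} P(x_i \mid c)}_{\text{standard NBC}}
\qquad\longrightarrow\qquad
\underbrace{P(c \mid \mathbf{x}) \;\propto\; P(c)\prod_{k=1}^{K} \hat{P}\!\left(\mathbf{x}_{G_k} \mid c\right)}_{\text{AG-NBC}}
\]
where $G_1,\dots,G_K$ partition the $d$ condition attributes into DAGs, and each group likelihood $\hat{P}(\mathbf{x}_{G_k} \mid c)$ is derived from the joint probability density estimate produced by the random vector functional link network trained for that DAG, so that dependencies within a group are modeled jointly while independence is assumed only across groups.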