This study presents an optimized recurrent neural network architecture, termed "RegGRU-Opt", designed for binary text classification tasks such as sentiment analysis of Amazon reviews. By integrating gated recurrent units (GRUs), strategically placed dropout layers, L2 regularization in the dense layers, and optimization techniques such as early stopping and learning-rate scheduling, the proposed model generalizes robustly and outperforms baseline models including CNN, RNN, and SeqClassRNN. Experimental results show high accuracy (≈98%), F1-score (≈97%), and ROC AUC (≈99%), underscoring the model's capacity to capture nuanced textual patterns. Data balancing methods, including SMOTE and SMOTE + Tomek Links, ensure fair representation of both classes, while feature-importance analyses (Chi-squared and SHAP) enhance model interpretability. Despite the improved performance, challenges remain in domain adaptation, computational cost, and handling severely imbalanced data. Future work will explore hybrid models, advanced regularization methods, broader datasets, and improved efficiency, extending the applicability of RegGRU-Opt to diverse real-world text classification scenarios.
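The GRU backbone underlying the described architecture can be illustrated with a minimal NumPy sketch of a single GRU cell; the layer sizes, weight names, and sequence length below are illustrative assumptions, not values taken from the study:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h_prev, W, U, b):
    """One GRU time step. W, U, b each hold the update ('z'),
    reset ('r'), and candidate ('h') parameters in a dict."""
    z = sigmoid(x @ W["z"] + h_prev @ U["z"] + b["z"])              # update gate
    r = sigmoid(x @ W["r"] + h_prev @ U["r"] + b["r"])              # reset gate
    h_cand = np.tanh(x @ W["h"] + (r * h_prev) @ U["h"] + b["h"])   # candidate state
    return (1.0 - z) * h_prev + z * h_cand                          # interpolated new state

# Illustrative sizes: 8-dim input embeddings, 16-dim hidden state.
rng = np.random.default_rng(0)
d_in, d_h = 8, 16
W = {k: rng.standard_normal((d_in, d_h)) * 0.1 for k in "zrh"}
U = {k: rng.standard_normal((d_h, d_h)) * 0.1 for k in "zrh"}
b = {k: np.zeros(d_h) for k in "zrh"}

h = np.zeros(d_h)
for x_t in rng.standard_normal((5, d_in)):  # a toy 5-token sequence
    h = gru_cell(x_t, h, W, U, b)
print(h.shape)  # (16,)
```

In a full model, this recurrence would be wrapped in a framework GRU layer, followed by dropout and an L2-regularized dense classification head.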
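The SMOTE oversampling mentioned above generates synthetic minority samples by interpolating between a minority point and one of its minority-class nearest neighbors. A minimal pure-NumPy sketch of that interpolation step (not the production imbalanced-learn implementation; the parameters `k` and `n_new` are illustrative):

```python
import numpy as np

def smote_sample(X_min, k=3, n_new=10, rng=None):
    """Create n_new synthetic minority samples by interpolating
    randomly chosen points toward one of their k nearest
    minority-class neighbors."""
    if rng is None:
        rng = np.random.default_rng(0)
    n = len(X_min)
    # Pairwise distances within the minority class only.
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                 # exclude self-matches
    neighbors = np.argsort(d, axis=1)[:, :k]    # k nearest minority neighbors
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(n)                     # random minority point
        j = neighbors[i, rng.integers(k)]       # one of its neighbors
        lam = rng.random()                      # interpolation factor in [0, 1)
        synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.asarray(synthetic)

X_min = np.random.default_rng(1).standard_normal((20, 5))  # toy minority class
X_syn = smote_sample(X_min, k=3, n_new=10)
print(X_syn.shape)  # (10, 5)
```

The SMOTE + Tomek Links variant adds a cleaning pass that removes Tomek-link pairs (mutual nearest neighbors from opposite classes) after oversampling, sharpening the class boundary.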