Text classification assigns a category to each piece of text. It is one of the fundamental tasks in natural language processing, with a wide range of applications such as spam detection and sentiment analysis. News classification, one such task, helps readers focus on news of interest to them. In this paper, we propose a novel method for the multiclass classification of Punjabi news articles using a pretrained language model built on an optimized and regularized long short-term memory (LSTM) network. The proposed method employs the Averaged Stochastic Gradient Descent Weight-Dropped LSTM (AWD-LSTM) model, which applies DropConnect, a recurrent regularization technique, to the hidden-to-hidden weights, and uses a variant of averaged stochastic gradient descent in which the averaging trigger is determined by a non-monotonic condition rather than tuned by the user. The proposed news classification method works in three stages. In the first stage, we train a language model on Punjabi text acquired from Wikipedia; in the second stage, we fine-tune the language model on the Punjabi news dataset. Finally, we train a classifier on top of the pretrained encoder of the language model. The pretrained encoder provides the classifier with a linguistic understanding of the text, leading to better classification performance. The results indicate that the proposed method outperforms direct news classification methods that do not use a pretrained language model.
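To make the weight-dropping idea concrete, the following PyTorch sketch applies DropConnect to the hidden-to-hidden weight matrix of a single LSTM layer. This is a minimal illustration, not the authors' implementation; all class and variable names are our own. One dropout mask is sampled per sequence, so the same recurrent connections stay dropped at every time step, and the non-monotonically triggered averaged SGD optimizer is omitted for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightDropLSTM(nn.Module):
    """Single-layer LSTM with DropConnect on the hidden-to-hidden
    weights, in the spirit of AWD-LSTM. Illustrative sketch only."""

    def __init__(self, input_size: int, hidden_size: int,
                 weight_dropout: float = 0.5):
        super().__init__()
        self.hidden_size = hidden_size
        self.weight_dropout = weight_dropout
        # Input-to-hidden and hidden-to-hidden weights for the four LSTM gates.
        self.w_ih = nn.Parameter(torch.randn(4 * hidden_size, input_size) * 0.1)
        self.w_hh = nn.Parameter(torch.randn(4 * hidden_size, hidden_size) * 0.1)
        self.bias = nn.Parameter(torch.zeros(4 * hidden_size))

    def forward(self, x: torch.Tensor):  # x: (batch, seq_len, input_size)
        batch, seq_len, _ = x.shape
        h = x.new_zeros(batch, self.hidden_size)
        c = x.new_zeros(batch, self.hidden_size)
        # DropConnect: drop individual *weights* of the recurrent matrix
        # (not activations); the mask is reused across all time steps.
        w_hh = F.dropout(self.w_hh, p=self.weight_dropout,
                         training=self.training)
        outputs = []
        for t in range(seq_len):
            gates = x[:, t] @ self.w_ih.T + h @ w_hh.T + self.bias
            i, f, g, o = gates.chunk(4, dim=1)
            c = torch.sigmoid(f) * c + torch.sigmoid(i) * torch.tanh(g)
            h = torch.sigmoid(o) * torch.tanh(c)
            outputs.append(h)
        return torch.stack(outputs, dim=1), (h, c)

# Example: a batch of 8 sequences of length 20 with 64-dimensional inputs.
lstm = WeightDropLSTM(input_size=64, hidden_size=128, weight_dropout=0.5)
out, (h, c) = lstm(torch.randn(8, 20, 64))  # out: (8, 20, 128)
```

Because the mask zeroes entries of the recurrent weight matrix rather than hidden activations, the regularization perturbs the recurrence itself while leaving the step-to-step dynamics otherwise intact, which is what allows it to be combined with standard (non-recurrent) dropout elsewhere in the network.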