Character recognition has gained much attention in pattern recognition due to its potential applications, such as document analysis, license plate detection, house number detection, and virtual text entry systems. However, recognizing characters under variations in pattern, style, translation, scale, and rotation is very challenging. This work develops a computationally efficient deep learning model to recognize handwritten, printed, and gesticulated characters. For gestures, this work proposes the NITS gesticulated database of 60 characters (10 digits, 26 English uppercase alphabets, 4 operations, 18 special symbols) with variations in pattern, style, and scale. To evaluate the ability and robustness of the proposed model, handwritten character databases (MNIST, EMNIST) and printed character databases (SVHN, Chars74K) are considered. The network achieves 94.55%, 89.54%, 87.33%, and 93.90% recognition accuracy on the NITS gesticulated, EMNIST merge (balanced), SVHN, and Chars74K databases, respectively.