A table is a compact, effective, and structured way of representing information in a document. Automatic localization of tables in scanned handwritten document images and extraction of their contents are critical and challenging tasks for applications such as Optical Character Recognition, handwriting analysis, and auto-evaluation systems. The task becomes even more complex when the handwritten document images are acquired with handheld mobile cameras, because the captured images are naturally distorted by poor illumination, device vibration, and camera angle, orientation, movement, and distance. In this article, a novel technique is proposed for the automatic localization and segmentation of tables in handwritten document images captured with a handheld mobile camera. Ruling lines are generally used for structuring tables, sketching figures, and writing scientific equations. In the present work, tables are detected and extracted from the edge features of their ruling lines in three main stages. First, a block-wise mean-computed fuzzy binarization technique is proposed to handle the distortions in the acquired image, after which the background surface enveloping the document area is removed. Second, a horizontal and vertical granule (strip)-based technique is proposed for fast extraction of edge features from the ruling lines of the table in the binarized image. Finally, entropy quantifiers are employed to segment the table in the image. The performance of the proposed technique is evaluated and reported on the proposed composite handwritten benchmark dataset. A linear computational complexity of O(h x w) is observed in the worst case.
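
As a rough illustration of the first stage, the following Python sketch shows one way a block-wise mean-based fuzzy binarization could be realized. The block size, the sigmoid-style fuzzy membership function, and the parameter values are illustrative assumptions and are not taken from the paper.

    import numpy as np

    def blockwise_mean_fuzzy_binarize(gray, block=32, spread=0.15):
        # gray: 2-D uint8 grayscale image; block and spread are assumed,
        # illustrative parameters (not values reported by the authors).
        h, w = gray.shape
        out = np.zeros((h, w), dtype=np.uint8)
        for y in range(0, h, block):
            for x in range(0, w, block):
                patch = gray[y:y + block, x:x + block].astype(np.float64)
                mu = patch.mean()  # block-wise mean intensity
                # Fuzzy membership of each pixel to the foreground (ink) class:
                # pixels well below the local mean receive membership near 1.
                member = 1.0 / (1.0 + np.exp((patch - mu) / (spread * 255.0)))
                out[y:y + block, x:x + block] = (member > 0.5).astype(np.uint8) * 255
        return out

Each pixel is visited a constant number of times in this sketch, which is consistent with the O(h x w) worst-case behaviour stated above.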