Automated surveillance of individual animal behavior, using low-cost cameras and computer vision techniques, can generate continuous data that provide an objective measure of behavior without disturbing the animals. The purpose of this study was to develop an automatic computer vision technique to quantify six behavior types of an individual laying hen (standing, sitting, sleeping, preening, scratching, pecking) continuously, and to compare the results with human visual observation, the current standard. For this purpose, a model-based algorithm was developed, based on the premise that behavior can be described as a time series of successive postures. The hen's posture is quantified by its position, orientation, and a set of shape parameters, obtained by fitting a point distribution model to the hen's outline. Applying this algorithm to successive frames of a video sequence yields a sequence of posture parameterizations that represents the hen's behavior within that sequence. A model for each behavior type is created by clustering the posture parameterizations calculated from training video sequences with known behavior, labeled by a trained ethologist. To classify the unknown behavior in a new video fragment, its posture parameter time series is calculated with the same algorithm and matched against each of the trained behavior models. The fragment is then assigned the behavior type whose model gives the best match. The system was tested on a set of more than 14,000 video fragments of a single hen in a cage, each fragment containing one of the six behavior types. Classification rates ranged from 70% to 96%, except for pecking (21%), which suffered from unreliable tracking of the hen's head. The best results were obtained for sleeping (96%) and standing (90%).
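To make the training-and-matching stage concrete, the sketch below illustrates one possible realization in Python, assuming each frame has already been reduced to a posture vector (position, orientation, and point-distribution-model shape weights). The use of k-means clustering, a mean nearest-centre distance as the matching score, and the function names are illustrative assumptions, not the exact procedure used in the study.

```python
# Sketch of behavior-model training and fragment classification.
# Assumption: each video fragment is an array of posture vectors, shape (frames, dims).
import numpy as np
from sklearn.cluster import KMeans

BEHAVIORS = ["standing", "sitting", "sleeping", "preening", "scratching", "pecking"]

def train_behavior_models(labeled_sequences, n_clusters=8):
    """Cluster the pooled posture vectors of each behavior's training sequences.

    labeled_sequences: list of (behavior_label, posture_array) pairs.
    Returns one fitted KMeans model per behavior type (illustrative choice).
    """
    models = {}
    for behavior in BEHAVIORS:
        postures = np.vstack([seq for label, seq in labeled_sequences if label == behavior])
        models[behavior] = KMeans(n_clusters=n_clusters, n_init=10).fit(postures)
    return models

def classify_fragment(posture_series, models):
    """Assign the behavior whose model best matches the fragment's posture time series."""
    scores = {}
    for behavior, km in models.items():
        # Distance of each frame's posture vector to its nearest cluster centre;
        # a lower mean distance indicates a better match to that behavior model.
        dists = np.min(
            np.linalg.norm(
                posture_series[:, None, :] - km.cluster_centers_[None, :, :], axis=2),
            axis=1)
        scores[behavior] = dists.mean()
    return min(scores, key=scores.get)
```

In this sketch, the per-behavior cluster centres play the role of the trained behavior models described above, and the classification step simply selects the behavior whose centres lie closest, on average, to the fragment's posture trajectory.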