The new MPEG-4 standard, scheduled to become an International Standard in February 1999, will include support not only for natural video and audio, but also for synthetic graphics and sound. In particular, the representation of human faces and bodies will be supported. The current draft specification of the standard [MPEG-N1901, MPEG-N1902] defines Facial Animation Parameters (FAPs) and Facial Definition Parameters (FDPs). FAPs are used to control facial animation at extremely low bitrates (approx. 2 kbit/s). FDPs are used to define the shape of the face, either by deforming a generic facial model or by supplying a substitute model. We present algorithms to interpret the part of the FDPs dealing with the deformation of a generic facial model, leading to a personalisation of the model. The implementation starts from a generic model, which is deformed to fit the input parameters. The input parameters must include the facial feature points, and may optionally include texture coordinates and a calibration face model. We apply a cylindrical projection to the generic face in order to interpolate any missing feature points and to fit the texture, if supplied. We then use a Dirichlet Free Form Deformation [Moccozet 97] interpolation method to deform the generic head according to the set of feature points. If a calibration face model is present, the fitting method is based on matching cylindrical projections and on barycentric coordinates to interpolate the non-feature points.
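The cylindrical projection step mentioned above can be illustrated with a minimal sketch. The function below is a hypothetical helper (the paper does not specify its implementation): it maps each 3D head vertex (x, y, z) to a 2D coordinate (theta, y) on a cylinder around the vertical axis, which is the mapping in which missing feature points can be interpolated and texture coordinates fitted.

```python
import math

def cylindrical_projection(vertices, axis_origin=(0.0, 0.0)):
    """Project 3D head vertices onto a cylinder around the vertical (y) axis.

    Each vertex (x, y, z) maps to (theta, y), where theta is the angle of
    (x, z) around the cylinder axis passing through axis_origin = (cx, cz).
    Illustrative sketch only; axis placement and handling of the wrap-around
    at theta = +/- pi are assumptions, not taken from the paper.
    """
    cx, cz = axis_origin
    projected = []
    for x, y, z in vertices:
        theta = math.atan2(z - cz, x - cx)  # angle around the vertical axis
        projected.append((theta, y))
    return projected
```

In this flattened (theta, y) domain, two heads roughly aligned on the same axis become comparable 2D point sets, so correspondences between generic and target feature points reduce to 2D matching.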