In this paper we present a somatosensory interaction method based on human skeletal information. First, we use the Kinect sensor to obtain depth image data and capture joint positions with its skeletal tracking technology, matching them to the corresponding parts of the human body and then establishing 3D coordinates for every joint. Second, we use these 3D coordinates and the offsets between joints to calculate the motion and rotation angles of the waist, shoulder, and foot joints. Finally, through middleware connecting Unity 3D and the Kinect, the skeletal point data and the RGB stream from the Kinect are received in the engine, and we realize a test in which the user's body directly controls a Unity 3D model.
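To make the angle computation concrete, the sketch below shows one way a rotation angle at a joint can be derived from the 3D coordinates of its adjacent joints, using the angle between the two bone vectors. This is a minimal, self-contained illustration under stated assumptions: the function name, joint choices, and sample coordinates are hypothetical, and the paper's actual implementation runs inside Unity 3D rather than Python.

```python
import numpy as np

def joint_angle(parent, joint, child):
    """Angle in degrees at `joint`, formed by the segments to `parent` and `child`.

    All arguments are 3D joint positions (NumPy arrays), such as those
    produced by the skeletal tracking step described above.
    """
    u = parent - joint
    v = child - joint
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clamp to [-1, 1] to guard against floating-point drift before arccos.
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Hypothetical example: elbow flexion from shoulder, elbow, and wrist
# positions given in metres in the sensor's coordinate frame.
shoulder = np.array([0.10, 1.40, 2.00])
elbow    = np.array([0.30, 1.15, 2.05])
wrist    = np.array([0.25, 0.90, 1.80])
print(f"elbow angle: {joint_angle(shoulder, elbow, wrist):.1f} deg")
```

The same pattern applies to the waist, shoulder, and foot joints mentioned above: each angle is computed from a triple of tracked joint positions, and the resulting values can then drive the corresponding bones of the Unity 3D model.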