Human Gesture Recognition using Keyframes on Local Joint Motion Trajectories


INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, vol.8, pp.131-136, 2017 (Journal Indexed in ESCI)

  • Volume: 8 Issue: 4
  • Publication Date: 2017
  • Pages: pp.131-136


Human Action Recognition (HAR) systems recognize and classify the actions that users perform in front of a sensor or camera. In most HAR systems, input test data are compared with reference data in a database using various methods, and classification is performed according to the result. The size of the test or reference data directly affects the operating speed of the system; reducing the data size yields a significant increase in speed. In this study, an action recognition method is proposed that uses skeletal joint information obtained from the Microsoft Kinect sensor. Keyframes that split the local joint motion trajectories are extracted from the skeletal joint information. These keyframes are observed to be a distinguishing feature, so they are used for the classification process. Storing keyframes in the reference database, instead of the full position or angle information of an action, saves both memory and processing time. A weight value is calculated for each keyframe. The temporal differences that arise when comparing a test action with a reference action are resolved by Dynamic Time Warping (DTW), and the k-nearest neighbors algorithm performs the classification according to the DTW results. The method was evaluated on a sample data set, and 100% correct classification was achieved. The method is also suitable for real-time systems. The keyframes can additionally be used to provide feedback to the user after classification: their magnitude and direction, the change in the joint trajectory, their position, and the time at which they occur also give information about timing errors.
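The DTW-plus-kNN pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each action is a sequence of keyframe feature vectors, compares sequences with a standard DTW recurrence, and labels a test action by majority vote over the k reference actions with the smallest DTW distances. All function and variable names are illustrative.

```python
def dtw_distance(seq_a, seq_b):
    """DTW distance between two sequences of keyframe feature vectors.

    Handles sequences of different lengths, which is exactly the
    temporal-difference problem DTW is used for in the abstract.
    """
    n, m = len(seq_a), len(seq_b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Euclidean distance between two keyframe feature vectors
            d = sum((a - b) ** 2 for a, b in zip(seq_a[i - 1], seq_b[j - 1])) ** 0.5
            # Classic DTW recurrence: match, insertion, or deletion
            cost[i][j] = d + min(cost[i - 1][j],      # step in seq_a
                                 cost[i][j - 1],      # step in seq_b
                                 cost[i - 1][j - 1])  # step in both
    return cost[n][m]

def knn_classify(test_seq, references, k=3):
    """Label a test sequence by majority vote over the k nearest
    reference sequences, using DTW distance as the metric.

    references: list of (label, sequence) pairs.
    """
    dists = sorted((dtw_distance(test_seq, seq), label)
                   for label, seq in references)
    votes = {}
    for _, label in dists[:k]:
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)
```

For example, with reference sequences of 1-D features for hypothetical "up" and "down" actions, a rising test sequence of a different length is still matched to "up", since DTW absorbs the temporal mismatch before the k-NN vote.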