Fall Motion Detection with Fall Severity Level Estimation by Mining Kinect 3D Data Stream

Orasa Patsadu1, Bunthit Watanapa1, Piyapat Dajpratham2, and Chakarida Nukoolkit1

1School of Information Technology, King Mongkut’s University of Technology Thonburi, Thailand

2Faculty of Medicine Siriraj Hospital, Mahidol University, Thailand

Abstract: This paper proposes an integrative model of fall motion detection and fall severity level estimation. For fall motion detection, a continuous stream of data representing time-sequential frames of fifteen body-joint positions is obtained from the Kinect 3D depth camera. A set of features is then extracted and fed into the designated machine learning model. Compared with existing models that rely on depth-image inputs, the proposed scheme resolves the background ambiguity of the human body. The experimental results demonstrate that the proposed fall detection method achieves an accuracy of 99.97% with zero false negatives and is more robust than the state-of-the-art approach based on depth images. Another key novelty of our approach is the framework, called the Fall Severity Injury Score (FSIS), for determining the severity level of a fall as a surrogate for the seriousness of injury to three selected risk areas of the body: the head, hip, and knee. The framework is based on two crucial pieces of information from the fall: 1) the velocity of the impact position and 2) the kinetic energy of the fall impact. Our proposed method is beneficial to caregivers, nurses, and doctors in giving first aid, diagnosis, or treatment to the subject, especially in cases where the subject loses consciousness or is unable to respond.
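To illustrate the two severity cues named above, the sketch below estimates a joint's impact velocity from consecutive Kinect frames and the corresponding kinetic energy (0.5 * m * v^2). It is a minimal illustration only: the joint names, effective masses, and frame timing are assumed for the example and do not reproduce the paper's actual FSIS parameters or scoring.

```python
import numpy as np

# Hypothetical effective masses (kg) for the three risk areas; the paper's
# actual body-segment parameters are not reproduced here.
JOINT_MASS = {"head": 5.0, "hip": 10.0, "knee": 4.0}

def impact_velocity(positions, timestamps):
    """Estimate the speed (m/s) of a joint just before impact from its
    3D positions (N x 3, metres) in consecutive Kinect frames."""
    positions = np.asarray(positions, dtype=float)
    dt = np.diff(np.asarray(timestamps, dtype=float))            # frame intervals (s)
    displacements = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    speeds = displacements / dt
    return speeds[-1]                                            # speed over the last interval

def kinetic_energy(joint, velocity):
    """Kinetic energy (J) of the fall impact for a given joint."""
    return 0.5 * JOINT_MASS[joint] * velocity ** 2

# Example: head joint tracked over three frames at roughly 30 fps
pos = [(0.00, 1.60, 2.00), (0.00, 1.45, 2.00), (0.00, 1.20, 2.00)]
ts = [0.000, 0.033, 0.066]
v = impact_velocity(pos, ts)
print(f"impact velocity = {v:.2f} m/s, "
      f"kinetic energy = {kinetic_energy('head', v):.1f} J")
```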

Keywords: Kinect 3D data stream, fall motion detection, fall severity level estimation, machine learning, smart home system.

Received August 25, 2015; accepted December 27, 2015