A Novel Recurrent Neural Networks Architecture for Behavior Analysis

Neziha Jaouedi1, Noureddine Boujnah2, and Mohamed Bouhlel3

1Electrical Engineering Department, Gabes University, Tunisia

1,3SETIT Lab, Tunisia

2Faculty of Sciences of Gabes, Tunisia

Abstract: Behavior analysis is an important yet challenging task in computer vision, and it remains a necessity in many sectors. With the increase in crime, video surveillance is needed to keep belongings safe and to detect events automatically, collecting important information to assist security guards. Moreover, the surveillance of human behavior has recently been used in the medical field to quickly detect physical and mental health problems in patients. The complexity and variety of human feature representations in video sequences push researchers to look for an effective representation, which is the most challenging part: it must be invariant to changes in viewpoint, robust to noise, and efficient, with low computation time. In this paper, we propose a new model for human behavior analysis that combines a transfer learning model and a Recurrent Neural Network (RNN). Our model extracts human features from frames using a pre-trained Convolutional Neural Network (CNN), Inception V3. The extracted features are then classified by an RNN with Gated Recurrent Units (GRU). The performance of the proposed architecture is evaluated on three human action datasets, UCF Sport, UCF101, and KTH, and achieves good classification accuracy.
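To make the two-stage architecture concrete, the sketch below shows one plausible realization in TensorFlow/Keras: a frozen Inception V3 producing a feature vector per frame, followed by a GRU classifier over the frame sequence. This is a minimal illustration only, not the authors' code; the sequence length (40 frames), GRU width (256 units), dropout rate, and the UCF101 class count are assumptions, since the abstract does not specify hyper-parameters.

    # Illustrative sketch (not the authors' implementation): per-frame features
    # from a pre-trained Inception V3, then a GRU classifier over the sequence.
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers, models

    NUM_CLASSES = 101   # e.g., UCF101 (assumed setting)
    SEQ_LEN = 40        # frames sampled per video (assumed)
    FEAT_DIM = 2048     # Inception V3 average-pooled feature size

    # Stage 1: frozen Inception V3 as a per-frame feature extractor.
    cnn = tf.keras.applications.InceptionV3(
        include_top=False, weights="imagenet", pooling="avg")
    cnn.trainable = False

    def extract_features(frames):
        """frames: (SEQ_LEN, 299, 299, 3) array of RGB frames in [0, 255]."""
        x = tf.keras.applications.inception_v3.preprocess_input(frames)
        return cnn.predict(x, verbose=0)          # (SEQ_LEN, FEAT_DIM)

    # Stage 2: GRU network over the sequence of frame features.
    classifier = models.Sequential([
        layers.Input(shape=(SEQ_LEN, FEAT_DIM)),
        layers.GRU(256),                          # single GRU layer (assumed size)
        layers.Dropout(0.5),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    classifier.compile(optimizer="adam",
                       loss="sparse_categorical_crossentropy",
                       metrics=["accuracy"])

    # Example usage on one video (random frames stand in for real data):
    dummy = (np.random.rand(SEQ_LEN, 299, 299, 3) * 255.0).astype("float32")
    feats = extract_features(dummy)[np.newaxis]   # (1, SEQ_LEN, FEAT_DIM)
    probs = classifier.predict(feats, verbose=0)  # (1, NUM_CLASSES)

Separating the frozen CNN from the trainable GRU means frame features can be precomputed once per video, which keeps training cost low; this transfer-learning split is the core design choice the abstract describes.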

Keywords: Deep learning, recurrent neural networks, gated recurrent unit, video classification, convolutional neural network, behavior modelling, activity recognition.

Received December 29, 2018; accepted January 19, 2020

https://doi.org/10.34028/iajit/18/2/1