Abstract

Regular monitoring of physical activities such as walking, jogging, sitting, and standing can help reduce the risk of many diseases, including cardiovascular complications, obesity, and diabetes. Recent research has shown that effective Human Activity Recognition (HAR) supports the monitoring of people's physical activities and aids human healthcare. To this end, deep learning models with a novel automated hyperparameter generator are proposed and implemented to predict human activities such as walking, jogging, walking upstairs, walking downstairs, sitting, and standing more precisely and robustly. Conventional HAR systems are unable to manage real-time changes in the surrounding infrastructure; improved HAR approaches overcome this constraint by integrating multiple sensing modalities. These multiple sensors produce more accurate information, leading to a better perception of the activity being recognized. The proposed approach uses sensor-level fusion to integrate gyroscope and accelerometer data. The analysis is carried out on the widely accepted benchmark UCI-HAR dataset. Across several performance evaluation experiments, the classification accuracies of the long short-term memory (LSTM), convolutional neural network (CNN), and deep neural network (DNN) classifiers are 96%, 92%, and 93%, respectively. Compared to state-of-the-art deep learning models, the proposed method yields better results.
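The abstract describes the approach only at a high level. The sketch below is a minimal, hypothetical illustration (in Python with TensorFlow/Keras, a framework the abstract does not specify) of an LSTM classifier operating on fixed-length windows of fused accelerometer and gyroscope channels, in the spirit of the UCI-HAR setup. The window length, layer sizes, and training settings are illustrative assumptions, not the authors' actual configuration or the output of their hyperparameter generator.

import numpy as np
from tensorflow.keras import layers, models

# Assumed UCI-HAR-style window shape: 128 time steps per window,
# 6 channels (3-axis accelerometer + 3-axis gyroscope concatenated at
# the sensor level), and 6 activity classes. Values are illustrative.
WINDOW_LEN, N_CHANNELS, N_CLASSES = 128, 6, 6

def build_lstm_classifier():
    # Plain stacked-LSTM classifier; layer sizes are assumptions,
    # not the hyperparameters reported in the paper.
    model = models.Sequential([
        layers.Input(shape=(WINDOW_LEN, N_CHANNELS)),
        layers.LSTM(64, return_sequences=True),
        layers.LSTM(32),
        layers.Dropout(0.3),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Random placeholder data standing in for fused sensor windows.
    x = np.random.randn(32, WINDOW_LEN, N_CHANNELS).astype("float32")
    y = np.random.randint(0, N_CLASSES, size=32)
    model = build_lstm_classifier()
    model.fit(x, y, epochs=1, batch_size=16, verbose=0)
    print(model.predict(x[:2]).shape)  # (2, 6) class-probability vectors

In this sketch, sensor-level fusion is simply the concatenation of the two sensors' axes into one six-channel input window before classification; the CNN and DNN baselines mentioned in the abstract would consume the same fused windows with different network bodies.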

Creative Commons License

This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.
