Regular monitoring of physical activities such as walking, jogging, sitting, and standing helps reduce the risk of many diseases, including cardiovascular complications, obesity, and diabetes. Recent research has shown that effective Human Activity Recognition (HAR) supports the monitoring of physical activity and aids human healthcare. To this end, deep learning models with a novel automated hyperparameter generator are proposed and implemented to predict human activities such as walking, jogging, walking upstairs, walking downstairs, sitting, and standing more precisely and robustly. Conventional HAR systems are unable to manage real-time changes in the surrounding infrastructure; improved HAR approaches overcome this constraint by integrating multiple sensing modalities, which together produce richer and more accurate information and thus better activity recognition. The proposed approach uses sensor-level fusion to integrate gyroscope and accelerometer data. The analysis is carried out on the widely accepted UCI-HAR benchmark dataset. Across several performance evaluation experiments, the long short-term memory (LSTM), convolutional neural network (CNN), and deep neural network (DNN) classifiers achieve classification accuracies of 96%, 92%, and 93%, respectively. Compared with state-of-the-art deep learning models, the proposed method gives better results.
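The sensor-level fusion described in the abstract can be sketched as a channel-wise concatenation of synchronized accelerometer and gyroscope windows before classification. The function name and array shapes below are illustrative assumptions, not taken from the paper; the 128-sample window does match the UCI-HAR dataset's preprocessing (2.56 s windows at 50 Hz with 50% overlap).

```python
import numpy as np

def fuse_windows(accel: np.ndarray, gyro: np.ndarray) -> np.ndarray:
    """Sensor-level fusion: stack accelerometer and gyroscope
    windows along the channel axis.

    accel, gyro: arrays of shape (n_windows, window_len, 3),
    one column per axis (x, y, z), sampled on a common clock.
    Returns an array of shape (n_windows, window_len, 6) suitable
    as input to an LSTM/CNN/DNN classifier.
    """
    if accel.shape != gyro.shape:
        raise ValueError("accelerometer and gyroscope windows must align")
    return np.concatenate([accel, gyro], axis=-1)

# Example: 10 windows of 128 samples each (UCI-HAR window length).
accel = np.random.randn(10, 128, 3)
gyro = np.random.randn(10, 128, 3)
fused = fuse_windows(accel, gyro)
print(fused.shape)  # (10, 128, 6)
```

The fused windows would then be fed to the sequence classifier; any hyperparameter search over that classifier is outside this sketch.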
Patil, Basamma Umesh; Ashoka, D V; and V, Ajay Prakash B. "Data Integration Based Human Activity Recognition using Deep Learning Models," Karbala International Journal of Modern Science: Vol. 9, Article 11. Available at: https://doi.org/10.33640/2405-609X.3286
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.