Abstract:
Advances in sensor technology, growing analytical capabilities, and progress in the fields of Machine Learning (ML) and Deep Learning (DL) have led to a substantial increase in both the popularity and the performance of wearable sensors for Human Activity Recognition (HAR). The real challenge, however, is to detect activities in an unsupervised environment. In this study, we build a user-friendly and effective HAR system that uses a single inertial sensor to recognize eight everyday activities. The data were collected by our research team using a single inertial sensor in a fully unsupervised setup. The eight activities are: standing, sitting, sleeping, running, walking, cycling, going upstairs, and going downstairs. This paper presents a detailed analysis and comparison of three primary aspects of a general HAR pipeline that contribute to overall system performance: the effect of pre-processing, the choice among several extraction and selection methods for generating features from time-series data, and the performance of various classification methods, with the aim of finding the best combination of the three. The classification methods included in this study are Logistic Regression, K-Nearest Neighbors, Support Vector Machines, and Artificial Neural Networks. With the best-performing combination of parameters and techniques, the system recognizes the eight activities with an overall accuracy of 93.6%.
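For illustration only, the classifier comparison outlined above could be set up along the following lines. This is a minimal sketch, not the authors' actual code: the windowed feature matrix is a random placeholder, and the scikit-learn estimators, hyperparameters, and cross-validation scheme are assumptions rather than the configuration used in the paper.

```python
# Illustrative sketch of comparing the four classifiers named in the abstract.
# X stands in for windowed features extracted from inertial sensor data;
# window size, feature set, and hyperparameters are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(800, 24))      # placeholder: 800 windows x 24 features
y = rng.integers(0, 8, size=800)    # placeholder labels for the 8 activities

classifiers = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "K-Nearest Neighbors": KNeighborsClassifier(n_neighbors=5),
    "Support Vector Machine": SVC(kernel="rbf", C=1.0),
    "Artificial Neural Network": MLPClassifier(hidden_layer_sizes=(64,), max_iter=500),
}

for name, clf in classifiers.items():
    # Standardization here is a stand-in for the pre-processing step.
    pipe = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```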