Sensor Applications for Human Activity Recognition in Smart Environments

Date
2020-11-17
Abstract
Human activity recognition (HAR) is the automated recognition of individual or group activities from sensor inputs. It covers a wide range of application areas, such as health care, assistive technologies, quantified-self and safety applications. HAR is the key to building human-centred applications and enables users to interact seamlessly and naturally with each other or with a smart environment. A smart environment is an instrumented room or space equipped with sensors and actuators that perceive the physical state or the human activities within that space. The diversity of available sensors makes it difficult to choose the appropriate sensor for a specific application. This work presents sensor-driven applications for human activity recognition in smart environments, using novel sensing categories beyond the sensor technologies commonly applied to these tasks. The intention is to improve the interaction for various sub-fields of human activities. Each application addresses these difficulties by following the typical process pipeline for designing a smart environment application. First, I survey the most prominent research works, with a focus on sensor-driven categorization within the HAR research domain, in order to identify research gaps and position my work. I identify two use cases: quantified-self and smart home applications.

Quantified-self aims at self-tracking and self-knowledge through numbers. Common sensor technology for daily tracking of aerobic endurance training activities, such as walking, running or cycling, is based on acceleration data from wearables. However, more stationary exercises, such as strength-based training or stretching, are also important for a healthy lifestyle, as they improve body coordination and balance. These exercises are not tracked well by a single wearable sensor, because they rely on coordinated movement of the entire body. I leverage two sensing categories to design two portable mobile applications for remote sensing of these more stationary physical workout exercises.

Sensor-driven applications in the smart home domain aim at building systems that make the lives of the occupants safer and more convenient. In this thesis, I target stationary applications integrated into the environment to allow a more natural interaction between the occupant and the smart environment. I propose two solutions. The first is a surface-acoustic sensing system that detects a basic set of activities of daily living with a sparse sensor setup, including an investigation of minimalist sensor arrangements. The second is a tag-free indoor positioning system. Indoor localization provides location information for building intelligent smart home services: accurate indoor positions offer the basic context on which higher-level reasoning systems derive more complex contexts. The floor-based localization system uses electrostatic sensors and scales to different room geometries thanks to its layout and modular composition.

Finally, privacy through non-visual sensor input is a central design aspect of all applications proposed in this thesis.
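A rough sketch of the generic processing pipeline shared by such applications (sensing, windowing, feature extraction, classification) is given below. It assumes a tri-axial sensor stream and an off-the-shelf scikit-learn classifier purely for illustration; the shapes, window sizes and activity labels are assumptions and are not taken from the thesis.

# Minimal, illustrative sensor-based HAR pipeline: segment a continuous
# (T, 3) sensor stream into windows, compute simple per-axis statistics,
# and train a generic classifier on per-window activity labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def sliding_windows(stream, window=100, step=50):
    # Cut the stream into overlapping fixed-length windows: shape (N, window, 3).
    return np.stack([stream[i:i + window]
                     for i in range(0, len(stream) - window + 1, step)])

def extract_features(windows):
    # Per-axis mean, standard deviation and mean absolute difference.
    return np.concatenate([windows.mean(axis=1),
                           windows.std(axis=1),
                           np.abs(np.diff(windows, axis=1)).mean(axis=1)], axis=1)

# Placeholder data standing in for a real recording with per-window labels.
stream = np.random.randn(10_000, 3)
windows = sliding_windows(stream)
labels = np.random.choice(["walking", "running", "cycling"], size=len(windows))

clf = RandomForestClassifier(n_estimators=100)
clf.fit(extract_features(windows), labels)
print(clf.predict(extract_features(windows[:5])))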
In addition, this thesis addresses adaptivity, i.e. the step from prototypes towards real-world applications. I identify two issues: data sparsity in the training data and data diversity in the real-world data. To address data sparsity, I demonstrate a data augmentation strategy for time series that increases the amount of training data by generating synthetic samples. To mitigate the inherent differences between the development dataset and real-world scenarios, I further investigate several approaches, including metric-based learning and fine-tuning. I explore how to fine-tune the trained model on a limited amount of individual data, both with and without retraining the pre-trained inference model. Finally, I give examples of how to deploy the offline model to online processing devices with limited hardware resources.
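The following sketch illustrates these two ideas under stated assumptions: simple jittering and scaling as time-series augmentation, and fine-tuning a small pre-trained network on a few user-specific windows either with the feature extractor frozen or fully retrained. The architecture, hyper-parameters and function names are hypothetical and are not the models used in the thesis.

# Hedged sketch: (1) generate synthetic training windows by jittering and
# scaling, (2) fine-tune a pre-trained model on limited individual data,
# either freezing the feature extractor or retraining it end-to-end.
import numpy as np
import torch
import torch.nn as nn

def jitter(x, sigma=0.05):
    # Add Gaussian noise to a (T, C) window.
    return x + np.random.normal(0.0, sigma, x.shape)

def scale(x, sigma=0.1):
    # Randomly rescale each channel of a (T, C) window.
    return x * np.random.normal(1.0, sigma, (1, x.shape[1]))

def augment(windows, n_copies=2):
    # Append synthetic copies of every training window to the original set.
    synthetic = [f(w) for w in windows for f in (jitter, scale) for _ in range(n_copies)]
    return np.concatenate([windows, np.stack(synthetic)])

class HARNet(nn.Module):
    # Hypothetical pre-trained HAR network: 1D-conv feature extractor + linear head.
    def __init__(self, channels=3, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten())
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):  # x: (batch, channels, time)
        return self.head(self.features(x))

model = HARNet()  # in practice, pre-trained weights would be loaded here
# Variant A: keep the pre-trained feature extractor frozen, adapt only the head.
for p in model.features.parameters():
    p.requires_grad = False
optimizer_head = torch.optim.Adam(model.head.parameters(), lr=1e-3)
# Variant B: retrain the whole model on the user's data with a smaller learning rate.
optimizer_full = torch.optim.Adam(model.parameters(), lr=1e-4)

Freezing the extractor keeps a small amount of individual data from overwriting the generic features, while full retraining adapts those features to the user at the risk of overfitting; which variant works better depends on how much individual data is available.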