Towards Breathing as a Sensing Modality in Depth-Based Activity Recognition

Jochen Kempfle and Kristof Van Laerhoven


Description

Depth imaging has, through recent technological advances, become ubiquitous as products become smaller, more affordable, and more precise. Depth cameras have also emerged as a promising modality for activity recognition, as they allow detection of users' body joints and postures. Increased resolutions have now enabled a novel use of depth cameras that facilitates more fine-grained activity descriptors: the remote detection of a person's breathing, by picking up the small distance changes from the user's chest over time. We propose in this work a novel method to model chest elevation in order to robustly monitor a user's respiration whenever users are sitting or standing and facing the camera. The method is robust to users occasionally blocking their torso region and is able to provide meaningful breathing features for classification in activity recognition tasks. We illustrate that with this method, for specific activities such as paced-breathing meditation, breathing exercises, or post-exercise recovery, our model delivers a breathing accuracy that matches that of a commercial respiration chest monitor belt. Results show that the breathing rate can be detected with our method at an accuracy of 92 to 97% from a distance of two metres, outperforming state-of-the-art depth imaging methods especially for non-sedentary persons, and allowing separation of activities in respiration-derived feature space.
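The core idea above — recovering a breathing rate from the small, periodic distance changes of the chest over time — can be sketched with a simple spectral estimator. The code below is not the paper's method; it is a minimal illustration assuming a pre-extracted chest-depth time series (here synthetic), estimating the dominant frequency inside a plausible breathing band via an FFT:

```python
import numpy as np

def estimate_breathing_rate(chest_depth, fs, band=(0.1, 0.7)):
    """Estimate breathing rate in breaths/min from a chest-depth time
    series (metres) sampled at fs Hz, via the strongest FFT peak inside
    a plausible breathing-frequency band (default 6-42 breaths/min)."""
    x = chest_depth - np.mean(chest_depth)           # remove DC (static distance)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])   # restrict to breathing band
    peak_freq = freqs[mask][np.argmax(spectrum[mask])]
    return peak_freq * 60.0

# Synthetic example: subject at 2 m, ~5 mm chest motion at 15 breaths/min,
# recorded by a 30 fps depth stream with 1 mm sensor noise (all assumed values).
rng = np.random.default_rng(0)
fs = 30.0
t = np.arange(0, 60, 1.0 / fs)
depth = 2.0 + 0.005 * np.sin(2 * np.pi * 0.25 * t)   # 0.25 Hz = 15 breaths/min
depth += 0.001 * rng.standard_normal(len(t))
print(round(estimate_breathing_rate(depth, fs), 1))  # → 15.0
```

A real pipeline would first segment the torso in the depth image and average over a chest region to obtain `chest_depth`; the paper's model additionally handles occasional torso occlusions, which this sketch does not.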

Download

The original data as used in the paper can be downloaded via this link: SAR'20 dataset (6.5 GB)

Citation and more information

Towards Breathing as a Sensing Modality in Depth-Based Activity Recognition, Jochen Kempfle and Kristof Van Laerhoven. Sensors, 20(14), 2020.

Disclaimer

You may use this data for scientific, non-commercial purposes, provided that you give credit to the owners when publishing any work based on this data.