A Comparison between Audio and IMU data to Detect Chewing Events Based on an Earable Device
Abstract
The feasibility of collecting various data from built-in wearable sensors has enticed many researchers to use these devices for analyzing human activities and behaviors. In particular, audio, video, and motion data have been utilized for automatic dietary monitoring. In this paper, we investigate the feasibility of detecting chewing activities based on audio and inertial sensor data obtained from an ear-worn device, eSense. We process each sensor stream separately and determine the accuracy of each sensing modality for chewing detection. We also measure the performance of chewing detection when fusing features extracted from both the audio and inertial sensor data. We evaluate the chewing detection algorithm in a pilot study conducted in a lab environment with a total of 5 participants, yielding 130 minutes of audio and inertial measurement unit (IMU) data. The results of this study indicate that an in-ear IMU detects chewing with 95% accuracy, outperforming audio data, and that fusing both modalities improves the accuracy to 97%.
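As a rough illustration of the feature-level fusion the abstract describes, the sketch below extracts simple features from synchronized audio and IMU windows, concatenates them, and trains a classifier. The window sizes, feature set, and classifier here are illustrative assumptions, not the paper's actual pipeline; the random arrays stand in for real eSense recordings.

```python
# Minimal sketch of feature-level audio/IMU fusion for chewing detection.
# All feature choices and parameters are assumptions for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def audio_features(window):
    """Basic time-domain features for one audio window (assumed set)."""
    rms = np.sqrt(np.mean(window ** 2))                   # signal energy
    zcr = np.mean(np.abs(np.diff(np.sign(window)))) / 2   # zero-crossing rate
    return np.array([rms, zcr])

def imu_features(window):
    """Per-axis mean and std of IMU samples; `window` is (n_samples, n_axes),
    e.g. 6 axes for a 3-axis accelerometer plus 3-axis gyroscope."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def fused_features(audio_win, imu_win):
    """Feature-level fusion: concatenate both modalities' feature vectors."""
    return np.concatenate([audio_features(audio_win), imu_features(imu_win)])

# Hypothetical training data: synchronized windows with binary labels
# (1 = chewing, 0 = not chewing), standing in for real recordings.
rng = np.random.default_rng(0)
audio_wins = [rng.standard_normal(1600) for _ in range(40)]   # e.g. 100 ms @ 16 kHz
imu_wins = [rng.standard_normal((50, 6)) for _ in range(40)]  # e.g. 1 s @ 50 Hz
labels = rng.integers(0, 2, size=40)

X = np.stack([fused_features(a, m) for a, m in zip(audio_wins, imu_wins)])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)
print(clf.predict(X[:5]))
```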
Citation
Roya Lotfi, George Tzanetakis, Rasit Eskicioglu, and Pourang Irani. 2020. A Comparison between Audio and IMU data to Detect Chewing Events Based on an Earable Device. In 11th Augmented Human International Conference (AH ’20), May 27–29, 2020, Winnipeg, MB, Canada. ACM, New York, NY, USA, 8 pages. https://doi.org/10.1145/3396339.3396362
Authors
Roya Lotfi (Alumni)
Pourang Irani (Professor, Canada Research Chair at the University of British Columbia, Okanagan Campus)
As well as: George Tzanetakis, Rasit Eskicioglu