Influenza-like Symptom Recognition using Mobile Sensing and Graph Neural Networks
Guimin Dong, Lihua Cai, Debajyoti Datta, Shashwat Kumar, Laura E. Barnes, and Mehdi Boukhechba (University of Virginia)
View paper in the ACM journal
Abstract: Early detection of influenza-like symptoms can prevent the widespread transmission of flu viruses and enable timely treatment, particularly in the post-pandemic era. Mobile sensing leverages an increasingly diverse set of embedded sensors to capture fine-grained information about human behaviors and ambient contexts, and can serve as a promising solution for influenza-like symptom recognition. Traditionally, handcrafted features and high-level features of mobile sensing data are extracted using manual feature engineering and Convolutional/Recurrent Neural Networks, respectively. In this work, we instead use graph representations to encode the dynamics of state transitions and internal dependencies in human behaviors, apply graph embeddings to automatically extract topological and spatial features from graph input, and propose an end-to-end Graph Neural Network model with multi-channel mobile sensing input for influenza-like symptom recognition based on people's daily mobility, social interactions, and physical activities. Using data generated by 448 participants, we show that Graph Neural Networks (GNNs) with GraphSAGE convolutional layers significantly outperform baseline models with handcrafted features. Furthermore, we use a GNN interpretability method to generate insights (important nodes and graph structures) for symptom recognition. To the best of our knowledge, this is the first work that applies graph representations and graph neural networks to mobile sensing data for graph-based human behavior modeling.
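To make the GraphSAGE idea concrete, below is a minimal, illustrative sketch of a single GraphSAGE layer with mean aggregation, where each node updates its representation by combining its own features with the mean of its neighbors' features. This is not the authors' implementation: the function name, the toy behavior-transition graph, and the placeholder identity weights are all hypothetical, chosen only to show the aggregation mechanic in plain Python.

```python
# Hypothetical sketch of one GraphSAGE mean-aggregation layer
# (not the paper's actual model or code).

def sage_layer(features, neighbors, w_self, w_neigh):
    """One mean-aggregation step: h_v' = ReLU(W_self @ h_v + W_neigh @ mean(h_u))."""
    dim = len(features[0])
    out = []
    for v, h_v in enumerate(features):
        nbrs = neighbors.get(v, [])
        if nbrs:
            # Mean of neighbor feature vectors, element-wise.
            mean = [sum(features[u][i] for u in nbrs) / len(nbrs) for i in range(dim)]
        else:
            mean = [0.0] * dim
        # Linear transforms of self and aggregated neighbor features, then ReLU.
        h = [max(0.0,
                 sum(w_self[i][j] * h_v[j] for j in range(dim)) +
                 sum(w_neigh[i][j] * mean[j] for j in range(dim)))
             for i in range(len(w_self))]
        out.append(h)
    return out

# Toy graph: nodes are behavioral states (e.g. locations or activity levels),
# edges are observed state transitions; features are per-state sensing summaries.
features = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
neighbors = {0: [1, 2], 1: [0], 2: [0, 1]}
identity = [[1.0, 0.0], [0.0, 1.0]]  # placeholder weights for illustration

embeddings = sage_layer(features, neighbors, identity, identity)
print(embeddings)  # -> [[1.5, 1.0], [1.0, 1.0], [1.5, 1.5]]
```

In the paper's setting, the node embeddings produced by stacked layers of this kind would be pooled into a graph-level representation and fed to a classifier for symptom recognition.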