1. Scaling Wearable Foundation Models
- Authors
Girish Narayanswamy, Xin Liu, Kumar Ayush, Yuzhe Yang, Xuhai Xu, Shun Liao, Jake Garrison, Shyam Tailor, Jake Sunshine, Yun Liu, Tim Althoff, Shrikanth Narayanan, Pushmeet Kohli, Jiening Zhan, Mark Malhotra, Shwetak Patel, Samy Abdel-Ghaffar, and Daniel McDuff
- Subjects
Computer Science - Machine Learning; Computer Science - Artificial Intelligence; Computer Science - Human-Computer Interaction
- Abstract
Wearable sensors have become ubiquitous thanks to a variety of health tracking features. The resulting continuous and longitudinal measurements from everyday life generate large volumes of data; however, making sense of these observations for scientific and actionable insights is non-trivial. Inspired by the empirical success of generative modeling, where large neural networks learn powerful representations from vast amounts of text, image, video, or audio data, we investigate the scaling properties of sensor foundation models across compute, data, and model size. Using a dataset of up to 40 million hours of in-situ heart rate, heart rate variability, electrodermal activity, accelerometer, skin temperature, and altimeter per-minute data from over 165,000 people, we create LSM, a multimodal foundation model built on the largest wearable-signals dataset with the most extensive range of sensor modalities to date. Our results establish the scaling laws of LSM for tasks such as imputation, interpolation and extrapolation, both across time and sensor modalities. Moreover, we highlight how LSM enables sample-efficient downstream learning for tasks like exercise and activity recognition.
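The three reconstruction tasks the abstract names — imputation, interpolation, and extrapolation, across both time and sensor modalities — can be pictured as different mask patterns over a per-minute multichannel sensor array. The sketch below is illustrative only: the channel count, mask fractions, and helper functions are assumptions for exposition, not the paper's actual training setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for per-minute wearable signals: 6 channels
# (e.g. heart rate, HRV, EDA, accelerometer, skin temperature,
# altimeter) over one day (1440 minutes). Values are synthetic noise;
# real data would be the in-situ measurements described above.
signals = rng.normal(size=(6, 1440))

def impute_mask(shape, frac, rng):
    """Scattered random gaps: the model fills in missing entries."""
    return rng.random(shape) < frac

def interp_mask(shape, start, stop):
    """A contiguous interior window hidden across all channels."""
    m = np.zeros(shape, dtype=bool)
    m[:, start:stop] = True
    return m

def extrap_mask(shape, tail):
    """The final `tail` minutes hidden: the model forecasts forward."""
    m = np.zeros(shape, dtype=bool)
    m[:, -tail:] = True
    return m

masks = {
    "imputation": impute_mask(signals.shape, 0.3, rng),
    "interpolation": interp_mask(signals.shape, 600, 720),
    "extrapolation": extrap_mask(signals.shape, 240),
}

for name, m in masks.items():
    # The model sees only the unmasked entries and is trained to
    # reconstruct the masked ones.
    visible = np.where(m, 0.0, signals)
    print(name, f"masks {m.mean():.0%} of entries")
```

Masking whole channels instead of whole time windows would give the cross-modality variant of the same tasks, where one sensor stream is reconstructed from the others.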
- Published
2024