SAWSense: Using Surface Acoustic Waves for Surface-bound Event Recognition
Yasha Iravantchi, Yi Zhao, Kenrick Kin, and Alanson Sample
Abstract
Enabling computing systems to understand user interactions with everyday surfaces and objects can drive a wide range of applications. However, existing vibration-based sensors (e.g., accelerometers) lack the sensitivity to detect light touch gestures or the bandwidth to recognize activity containing high-frequency components. Conversely, microphones are highly susceptible to environmental noise, degrading performance. Each time an object impacts a surface, Surface Acoustic Waves (SAWs) are generated that propagate along the air-to-surface boundary. This work repurposes a Voice PickUp Unit (VPU) to capture SAWs on surfaces (including smooth surfaces, odd geometries, and fabrics) over long distances and in noisy environments. Our custom-designed signal acquisition, processing, and machine learning pipeline demonstrates utility in both interactive and activity recognition applications, such as classifying trackpad-style gestures on a desk and recognizing 16 cooking-related activities, all with >97% accuracy. Ultimately, SAWs offer a unique signal that can enable robust recognition of user touch and on-surface events.
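To make the abstract's "signal acquisition, processing, and machine learning pipeline" concrete, the sketch below shows one plausible way to classify on-surface events from SAW recordings. It is not the authors' implementation: the sample rate, the log-mel spectrogram features, and the SVM classifier are illustrative assumptions.

```python
# Minimal sketch (not the authors' pipeline): classify on-surface events
# from SAW recordings using log-mel spectrogram features and an SVM.
# Sample rate, window parameters, and classifier choice are assumptions.

import numpy as np
import librosa
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

SR = 48_000  # assumed acquisition sample rate for the VPU signal

def saw_features(signal: np.ndarray, sr: int = SR) -> np.ndarray:
    """Summarize a SAW event window as a fixed-length feature vector."""
    mel = librosa.feature.melspectrogram(
        y=signal, sr=sr, n_fft=2048, hop_length=512, n_mels=64
    )
    log_mel = librosa.power_to_db(mel, ref=np.max)
    # Mean and std over time keep the vector length independent of duration.
    return np.concatenate([log_mel.mean(axis=1), log_mel.std(axis=1)])

def train_event_classifier(windows, labels):
    """windows: list of 1-D SAW snippets; labels: event names (hypothetical)."""
    X = np.stack([saw_features(w) for w in windows])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
    clf.fit(X, labels)
    return clf
```

In practice, the same `saw_features` function would be applied to incoming event windows at inference time and passed to the trained classifier's `predict` method; the specific segmentation and feature choices in the paper may differ.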