Google has an interesting new project page up for Soli, which promises to use tiny radar chips, smaller than a quarter, to accurately detect nearby finger motion and translate it into useful input. Google imagines users one day turning an imaginary knob to lower the volume, sliding a thumb across an index finger to navigate a map, and using other gestures to extend the usable input area beyond the confines of the device itself. "The Soli chip can be embedded in wearables, phones, computers, cars and IoT devices in our environment," Google promises. And while the new landing page focuses on gestural readings, eager hackers have found a variety of creative uses, such as detecting which type of liquid is being poured (milk or water), 3D imaging, and a more natural music-conducting interface for computers.
Other proposed uses include smarter speakers that let you adjust them without needing visible buttons:
Google seems particularly keen on exploring how the technology complements the Internet of Things, enabling user interaction without requiring a screen or traditional input hardware. The project reminded me a lot of FingerIO, a research project out of the University of Washington: