Recently, Google unveiled its new pet project, codenamed Soli, which lets a user interact with a device through touchless gesture technology. It translates precise finger movements into commands recognized by the gadget, using a miniature radar built into it. During its unveiling by Google's ATAP group at Google I/O, presenters set the hands of a clock by mimicking the motion of turning an imaginary dial.
The man behind this project, Ivan Poupyrev, said in an interview that capturing the movements of the human hand has always been his passion, owing to its precision and versatility. The applications are limitless. The core idea is to use the radio-frequency spectrum to track the twitches and micromotions of our hands and fingers and convert them into digital commands. The Soli sensor works by emitting electromagnetic waves in a broad beam. The system (a full gesture-recognition pipeline) uses a radar antenna to record the energy scattered back by objects within the beam at a very high frame rate (10,000 fps). The reflected signals carry the characteristics of the object and are processed by a tailor-made radar sensing paradigm that extracts gesture-specific information, converting those Doppler signals into actual human intent. The advantage of radar is that it has no moving parts, can work through objects, is very precise thanks to millimeter-wave technology, and can be scaled down to a microchip, making it both reliable and accurate.
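To get a feel for the Doppler idea underlying this kind of sensing, here is a minimal sketch, not Google's actual implementation: a moving finger reflects the carrier back with a frequency shift proportional to its radial velocity (f_d = 2·v·f_c / c), and peaking the Doppler spectrum of the received signal recovers that velocity. All names and parameters below are illustrative assumptions.

```python
# Illustrative sketch of Doppler velocity estimation, loosely inspired by
# Soli-style radar sensing. Not Google's pipeline; parameters are assumptions.
import numpy as np

C = 3e8               # speed of light (m/s)
F_CARRIER = 60e9      # millimeter-wave carrier frequency (assumed 60 GHz)
FRAME_RATE = 10_000   # samples per second, matching the 10,000 fps figure

def doppler_shift(velocity_mps: float) -> float:
    """Expected Doppler shift for a reflector moving at the given radial velocity."""
    return 2 * velocity_mps * F_CARRIER / C

def simulate_echo(velocity_mps: float, n_samples: int = 1000) -> np.ndarray:
    """Baseband echo from a single reflector moving at constant velocity."""
    t = np.arange(n_samples) / FRAME_RATE
    return np.exp(2j * np.pi * doppler_shift(velocity_mps) * t)

def estimate_velocity(echo: np.ndarray) -> float:
    """Recover radial velocity from the peak of the Doppler spectrum."""
    spectrum = np.abs(np.fft.fft(echo))
    freqs = np.fft.fftfreq(len(echo), d=1 / FRAME_RATE)
    f_d = freqs[int(np.argmax(spectrum))]   # dominant Doppler frequency
    return f_d * C / (2 * F_CARRIER)

# A finger drifting toward the sensor at 5 cm/s gives a ~20 Hz Doppler shift;
# the sign of the recovered velocity tells "toward" from "away".
print(round(estimate_velocity(simulate_echo(0.05)), 3))   # → 0.05
print(round(estimate_velocity(simulate_echo(-0.05)), 3))  # → -0.05
```

A real gesture recognizer would track how this Doppler signature evolves over time (a spectrogram) and feed those features to a classifier, but the velocity estimate above is the basic physical quantity the radar observes.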
So now you can turn the volume up, or play/pause the next song, without even touching the device. Makes life easier, doesn't it? Impressive as this sounds, it isn't the first time gesture technology has been used to build such gadgets; it has been on the market for years, from wired gloves to Leap Motion to Airwriting. We will have to wait and see how far Google can take this. Development kits for Project Soli are expected to ship later this year.
To know more about it, check here.