Google’s newly FCC-approved Project Soli could allow for new levels of gesture control
Hey Google, which finger am I holding up?
One of the three laws laid down by esteemed sci-fi writer Arthur C. Clarke is that “Any sufficiently advanced technology is indistinguishable from magic.” What could be more magical than waving your hands around to control things?
Forget fiddly runes, smelly reagents, and other spell components; gestures are where it's at. If Google's Project Soli ever sees a commercial release, that brand of magic will be mainstream.
Project Soli and its FCC approval
The crucial step for Google came late on Tuesday, when the FCC granted a waiver letting it operate Soli's sensors at higher power levels than previously allowed. That should enable room-scale sensing, where the user doesn't need to be near the device they're operating. The FCC also granted Google permission to operate Soli devices aboard aircraft, provided they are used within the Federal Aviation Administration's existing rules for portable electronic devices.
The FCC granted this expanded testing on the grounds that it serves the public interest. Imagine this tech built into assistive technologies for users with mobility or other impairments, or into mundane fixtures like light switches and faucets in heavily trafficked areas. Touch-free controls based on Soli's sensors could reduce the risk of disease spreading in hospitals and other public spaces.
Project Soli has been around since 2015, when Google first showed off the gesture-based navigation tech, which uses radar waves to detect small finger movements and translate them into UI navigation. The Minority Report-style tech has already been shrunk down to a size that could fit in Android smartwatches, solving one of the main usability issues with the small screens of wrist-mounted tech.
How Soli differentiates itself from other gesture-based tech
In contrast to current gesture-based tech, which requires large movements to work, Project Soli can pick up even minor shifts in finger position.
This means it should let you change the volume by rubbing your forefinger and thumb together, or press a virtual button by tapping a fingertip to your thumb. The physical feedback of your own fingers touching makes the completely virtual controls feel real, which makes them easier to use.
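To picture how that "virtual dial" volume gesture might behave, here's a toy Python sketch. It is purely illustrative and not the real Soli pipeline (which relies on radar signal processing and machine learning); the function name, the millimetre-scale displacement stream, and the 0.5 mm-per-step mapping are all assumptions made up for this example.

```python
def apply_rub_gesture(volume, displacements_mm, mm_per_step=0.5):
    """Map a stream of tiny fingertip displacements (in mm) to volume steps.

    Hypothetical sketch: each 0.5 mm of finger-rub motion nudges the
    volume by one step, clamped to the 0-100 range.
    """
    for d in displacements_mm:
        volume += d / mm_per_step
        volume = max(0.0, min(100.0, volume))
    return round(volume)

# Rubbing forefinger and thumb together by ~2.5 mm in five small movements:
print(apply_rub_gesture(50, [0.5, 0.5, 0.5, 0.5, 0.5]))  # -> 55
```

The point of the clamping and the tiny per-sample deltas is that sub-millimetre motion, accumulated over a gesture, is enough to drive a control smoothly, which is exactly the kind of fine input large-movement gesture systems can't capture.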
I really hope Project Soli sees the light of day; the gesture-controlled future depicted in so many sci-fi shows looks amazing. I won't hold my breath, though, as Google has a pretty dismal track record of shutting down long-running projects. Maybe the rebranding and continuation of Google Fi has broken that trend.