

Google’s newly FCC-approved Project Soli could allow for new levels of gesture control

Hey Google, which finger am I holding up?

[Image: Google Project Soli demonstration. Credit: Google]

One of the three laws laid down by esteemed sci-fi writer Arthur C. Clarke is that “Any sufficiently advanced technology is indistinguishable from magic.” What could be more magical than waving your hands around to control things?

Forget fiddly runes, smelly reagents, and other spell components; gestures are where it’s at. If Google’s Project Soli ever sees a commercial release, that brand of magic will go mainstream.

Project Soli and its FCC approval

The crucial step for Google came late on Tuesday, when the FCC granted a waiver letting it operate the sensors involved at higher power levels than before. That should enable room-scale navigation, where the user doesn’t need to be near the device they’re operating. The FCC also granted Google permission to operate Soli devices onboard aircraft, provided they are used within the existing Federal Aviation Administration rules for portable electronic devices.

The FCC granted this expansion of testing because it serves the public interest. Imagine this tech built into assistive technologies for users with mobility or other impairments. Or consider mundane things like light switches and faucet handles in heavily trafficked areas: touchless Soli sensors could reduce the risk of disease spreading in hospitals and other public spaces.

Project Soli has been around since 2015, when Google first showed off the gesture-based navigation tech, which uses radar waves to detect small finger movements and navigate a UI. The Minority Report-style tech has already been shrunk down to a size that could fit in Android smartwatches, solving one of the main usability issues with the small screens on wrist-mounted tech.

How Soli differentiates itself from other gesture-based tech

In contrast to current gesture-based tech that requires large movements to work, Project Soli can pick up even minor shifts in the position of fingers.

This means it should let you do things like change the volume by rubbing your forefinger and thumb together, or press a virtual button by tapping a fingertip to your thumb. The physical feedback of your own fingers touching makes the completely virtual controls feel real, aiding in their use.
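To picture how an app might consume those micro-gestures, here’s a minimal sketch. Google hasn’t published a public Soli API, so the gesture names, event format, and handler below are entirely hypothetical; the point is just that a continuous gesture (the thumb-and-forefinger rub) maps to a dial-style control while a fingertip tap maps to a discrete button press.

```python
# Hypothetical sketch of mapping Soli-style micro-gestures to UI actions.
# Gesture names and the event shape are invented for illustration only;
# they do not reflect any real Soli API.

def handle_gesture(state, event):
    """Update UI state for one recognized micro-gesture event."""
    gesture, value = event
    if gesture == "dial":
        # Thumb/forefinger rub: a continuous control, like a volume knob.
        state["volume"] = max(0, min(100, state["volume"] + value))
    elif gesture == "button_press":
        # Fingertip tapped to thumb: a discrete action, like play/pause.
        state["playing"] = not state["playing"]
    return state

state = {"volume": 50, "playing": False}
state = handle_gesture(state, ("dial", 5))           # rub raises volume to 55
state = handle_gesture(state, ("button_press", None))  # tap toggles playback
print(state)  # {'volume': 55, 'playing': True}
```

The split between continuous and discrete gestures mirrors what makes the rubbing motion appealing: your own skin provides the tactile feedback a physical knob or button normally would.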

I really hope Project Soli sees the light of day; the gesture-controlled future shown in so many sci-fi shows looks amazing. I won’t hold my breath, though, as Google has a dismal track record of killing off long-running projects. Maybe the rebranding and continuation of Google Fi has broken that trend.

What do you think of Project Soli? Let us know down below in the comments or carry the discussion over to our Twitter or Facebook.



Maker, meme-r, and unabashed geek with nearly half a decade of blogging experience at KnowTechie, SlashGear, and XDA Developers. If it runs on electricity (or even if it doesn't), Joe probably has one around his office somewhere, with a particular focus on gadgetry and handheld gaming. Shoot him an email at joe@knowtechie.com.

