Harrison and his team found that even when our fingers aren’t fully on the screen, the device still registers a weak connection. Our gadgets rightly filter out these weak signals—after all, you don’t want to activate an app when your finger is an inch away from the screen—but it’s that seemingly hazy data that Harrison is interested in. “We said, hm, there’s something interesting there; let’s embrace the weak signal,” he says.
By measuring a finger’s angle relative to the screen’s surface, the phone is able to register the x- and y-rotation of a touch. This opens a whole new dimension for touchscreens, and creates a richer vocabulary of interactions we can use on our ever-shrinking screens.
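To make the idea concrete, here’s a hypothetical sketch (not Harrison’s actual method) of how a finger’s in-plane rotation might be recovered from the “weak signal” the article describes. The assumption: the fingertip pressed against the glass leaves a compact high-signal core in the capacitive image, while the rest of the finger hovering just above it leaves a weaker shadow; the vector from core to shadow points along the finger. The function name, thresholds, and image format are all invented for illustration.

```python
import math

def touch_angle(cap_image, hi=0.6, lo=0.2):
    """Estimate a finger's in-plane rotation from a capacitive image,
    given as a list of rows of floats in [0, 1].

    Hypothetical sketch: the strong 'core' (values >= hi) marks the
    fingertip contact, the weak 'shadow' (values >= lo) includes the
    hovering finger.  The core-to-shadow vector gives the finger's
    direction; returns the angle in degrees, or None if no touch."""
    def centroid(threshold):
        sx = sy = total = 0.0
        for y, row in enumerate(cap_image):
            for x, v in enumerate(row):
                if v >= threshold:
                    sx += x * v
                    sy += y * v
                    total += v
        return (sx / total, sy / total) if total else None

    core, shadow = centroid(hi), centroid(lo)
    if not core or not shadow:
        return None  # signal too weak everywhere: no touch registered
    dx, dy = shadow[0] - core[0], shadow[1] - core[1]
    return math.degrees(math.atan2(dy, dx))

# A fingertip at the right with a weak trail to the left reads as a
# finger pointing left (180 degrees):
image = [[0.3, 0.3, 0.3, 0.3, 1.0]]
print(touch_angle(image))
```

The same weak-signal filtering the article mentions shows up here as the `lo` threshold: anything fainter is discarded entirely, which is exactly the data Harrison’s group chose to embrace instead.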
So, that research Google X was doing on using short-wave radar¹ to detect hand gestures far away from screens is… pointless, since the screens already do that?
- If I’m remembering right. It’s Thanksgiving break as I’m writing this, which means I am formally too lazy to check my sources. ↩