What it is: Apple recently patented a way for a computer to detect hand gestures.
During the game console wars between Microsoft, Sony, and Nintendo, Nintendo scored a major coup when it released the Wii, which let players control a game by holding a motion-sensing controller. Moving the controller moved objects on the screen, such as mimicking the swing of a baseball bat or tennis racket, which helped make games more immersive. Then Microsoft leapfrogged Nintendo by using gesture recognition technology from a company called PrimeSense, which helped Microsoft create the Kinect motion detector.
With the Kinect, players could play games without holding anything at all. The Kinect detected a player's movements and translated them into the video game, such as a dance game where players danced on their living room floor. Microsoft even introduced a software developer's kit (SDK) to help developers create programs that took advantage of the Kinect, and surgeons found it useful for viewing medical images during surgery without physically touching anything that could contaminate their hands. Yet for some odd reason, Microsoft let the Kinect fade from its attention until it was essentially dead.
Curiously, Apple acquired PrimeSense, the company behind the Kinect's technology, and has now patented a hand gesture recognition device for translating hand gestures into computer commands. Most likely this will do exactly what people used the Kinect to do on PCs. The difference is that Apple will likely offer compelling reasons to use such motion detecting technology, while Microsoft did not.
Right now, the iPhone X's cameras and sensors can scan a person's face, but it's only a matter of time before such sensors and cameras can detect hand gestures as well. Such hands-free computing might seem pointless on a Macintosh or iPad, but it could be perfect for the HomePod.
Remember, the HomePod is essentially introducing hands-free computing, so in addition to controlling it by voice, you might also be able to control it with hand gestures. Rumors have circulated that the HomePod will eventually add a screen, and what better way to control that screen than through hand gestures instead of voice?
The paradigm for user interfaces is simple:
- Macintosh – keyboard and mouse through a graphical user interface
- iPhone and iPad – touch screen
- Apple Watch – sensors touching the skin and a tiny touch screen
- Apple TV – swiping gestures on a remote control
- HomePod – voice recognition and possibly hand gestures
Using hand gestures on an iPhone will likely be pointless because you're typically holding an iPhone, so you might as well touch it. However, hand gestures can be useful for the Apple TV or HomePod since you're likely to be away from those devices instead of touching them like an iPhone or iPad.
Hand gestures aren’t going to replace other forms of user interfaces like keyboards and touch screens, but they could work perfectly for niche devices like the HomePod or Apple TV. Whatever the case, motion detecting technology will likely find its way into mainstream products in the next few years, but look for it first on future models of the HomePod.