What it is: Apple has patented health monitoring features for a wearable headphone device such as its AirPods.
When most people think of wearable computers, they think of the Apple Watch. Yet Apple also sells AirPods, which appear to be nothing more than wireless headphones that let you listen to music from your iPhone, iPad, or iPod. If listening to music were their whole purpose, AirPods might seem limited, but Apple may have bigger plans for them that involve health monitoring.
The latest Apple patent reveals plans for measuring temperature, heart rate, and perspiration levels through integrated sensors that rely on skin contact within the ear. That would make AirPods just another form of wearable computer capable of real-time health tracking, and it suggests that the future of wearable computers will focus on real-time health monitoring.
What’s also interesting about AirPods is that unlike the Apple Watch, AirPods lack a visual user interface. Instead, you can control AirPods through voice and touch commands. While people may complain about the tiny screen of the Apple Watch, making the screen larger isn’t the answer because it makes the Apple Watch more cumbersome to wear and use regularly.
The real key to wearable computers is to rely less on a visual user interface and more on touch gestures and voice commands. For people who believe the iPad can't be useful because it lacks a mouse or trackpad, wearable computers like the Apple Watch and AirPods will further shatter this delusion, since it's impossible to use a mouse with the tiny Apple Watch screen or with AirPods.
The future of wearable computers is not only real-time health monitoring but a non-visual user interface that can be just as intuitive to use as the traditional mouse and keyboard visual user interface of pull-down menus and icons found on desktop computers.
User interfaces do not translate well to different form factors, which Microsoft learned the hard way, twice. First, they tried to shoehorn the Start menu interface of Windows onto smartphones running Windows Mobile. When that didn't work, they created the tile interface for Windows Phone and then tried to shoehorn that tile interface onto Windows 8.
In the meantime, Apple focused on graphical user interfaces for the Macintosh and touch user interfaces for the iPhone and iPad. Now, for wearable devices, Apple will likely move to touch and voice commands with little or no visual user interface at all. Rather than trying to cram the same user interface into different devices, it's far more effective to design a new user interface for each device.
So expect wearable computers to offer more real-time health monitoring features and more non-visual user interface features. If Microsoft wants to make another mistake, they can try cramming the Start menu interface of Windows, or the tile interface, onto a wearable computer like their now-discontinued Microsoft Band. If that sounds ridiculous, it's no more ridiculous than taking the tile interface of Windows Phone and putting it on Windows 8.
The mistakes of Microsoft and the patents of Apple show that one user interface cannot possibly work for all devices, which makes Microsoft's strategy of putting Windows everywhere ultimately pointless. An optimum user interface for each device is the future, and Apple's latest patent shows that the optimum user interface for wearable computers won't rely heavily on visual interfaces.