What it is: Augmented reality requires software development kits (SDKs) to work.
There’s an old website called OnlyAugmented.com out of Australia. While the site seems to have been dormant since 2014, it still contains a lot of interesting information about augmented reality, including a good explanation of how augmented reality tracking works.
Basically, augmented reality lets you point a smartphone or tablet camera at something real and see something virtual superimposed over that scene. That’s how Pokémon GO lets you look for Pokémon characters in your neighborhood. Some of the different ways to create augmented reality include:
- Sensory: GPS (location based), Compass (navigational)
- Sensory: Gyroscope (angle of entry), Accelerometer (speed of entry)
- Image based, near field: pre-trained (or pre-loaded) imagery, cloud based (talking to a database over wireless), or user-defined (programs like Metaio Creator allow you to define your own tracking image).
- 3D based: otherwise known as a “Point Cloud.” In short, this is the set of XYZ coordinates of an object that has been scanned (usually by a 3D scanner or camera), with all its points (all its coordinates) mapped into a point cloud. These can then be used to create augmented reality tracking, but also 3D CAD models for manufactured parts, metrology/quality inspection, and a multitude of visualization, animation, rendering, and mass customization applications.
- Live 3D based: This system is borrowed from the world of robotics and called simultaneous localization and mapping (SLAM). It is a technique used by robots and autonomous vehicles to build up a map within an unknown environment (without a prior knowledge), or to update a map within a known environment while at the same time keeping track of their current location.
- Human extremities: facial tracking (including recognition + tracking), fingers and bodies. No limit here!
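The sensory methods at the top of the list can be sketched with a little geometry: given the device’s GPS fix and compass heading, you can compute the bearing to a point of interest and decide where (if anywhere) its overlay belongs on screen. This is a minimal illustration of location-based AR, not any particular SDK’s API; the field-of-view and screen-width parameters are assumptions for the example.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the device (point 1) to a
    point of interest (point 2), in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def screen_offset(device_heading_deg, poi_bearing_deg, fov_deg=60, screen_width_px=1080):
    """Horizontal pixel offset from screen centre where the overlay for a
    point of interest should be drawn, or None if it is outside the
    camera's horizontal field of view."""
    # Signed angle between where the camera points and where the POI is, in (-180, 180]
    delta = (poi_bearing_deg - device_heading_deg + 180) % 360 - 180
    if abs(delta) > fov_deg / 2:
        return None  # POI is off-screen
    return round(delta / (fov_deg / 2) * (screen_width_px / 2))
```

For example, a point of interest 15° to the right of a north-facing device lands 270 px right of centre on an assumed 1080 px wide, 60° field-of-view display. Real AR apps fuse this with the gyroscope and accelerometer to smooth out compass jitter.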
If you look at the chart above, you can see that many augmented reality technologies rely on NFT, which stands for Natural Feature Tracking. NFT lets you point a camera at different items and place a virtual image over a specific area of the scene. In contrast, there’s also marker tracking, where you must physically place a marker on an item so an augmented reality device can identify the marker and superimpose an image over it.
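As a toy illustration of why marker tracking is the technically easy case: a fiducial marker is an artificial, high-contrast pattern, so locating it can be as simple as finding its pixels. Real systems use libraries such as OpenCV’s ArUco module; this stdlib-only sketch, which assumes the image has already been thresholded to a binary grid, just finds the marker’s bounding box.

```python
def find_marker(image):
    """Given a binary image (list of rows of 0/1 values), return the
    bounding box (row_min, col_min, row_max, col_max) of the marker
    pixels (value 1), or None if no marker is present."""
    coords = [(r, c) for r, row in enumerate(image)
              for c, v in enumerate(row) if v == 1]
    if not coords:
        return None
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return (min(rows), min(cols), max(rows), max(cols))

# A tiny 4x4 "photo" with a 2x2 marker in the middle
frame = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [0, 1, 1, 0],
         [0, 0, 0, 0]]
```

NFT has no such artificial pattern to lean on: it has to find stable, naturally occurring features (corners, edges, texture) in an ordinary scene, which is why it is the harder and more valuable problem.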
Obviously marker tracking is impractical in most cases, because few places will let you slap a marker on them just so augmented reality devices can recognize it. If you look at what Metaio used to offer, you can get a hint of what Apple will offer in the future, because Apple acquired Metaio.
What Apple needs to do to get augmented reality to work on the iPhone/iPad is to make sure it can accurately identify areas to superimpose virtual images. For example, if you point your smartphone camera at a street, you want to see that street name appear superimposed over the actual street image, regardless of the angle you aim it from or the lighting conditions, whether it’s a sunny, cloudy, or rainy day.
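That “regardless of the angle” requirement comes down to re-projecting a known 3D point into the camera view as the phone moves. Here is a minimal pinhole-camera sketch; the focal length, screen size, and coordinate convention (x right, y down, z forward) are assumptions for illustration, not anything from Apple’s software.

```python
import math

def project_point(point_xyz, focal_px=800, image_w=1080, image_h=1920):
    """Project a 3D point in camera coordinates onto the image plane of
    a pinhole camera. Returns (u, v) pixel coordinates, or None if the
    point is behind the camera."""
    x, y, z = point_xyz
    if z <= 0:
        return None
    u = image_w / 2 + focal_px * x / z
    v = image_h / 2 + focal_px * y / z
    return (u, v)

def yaw(point_xyz, angle_deg):
    """Rotate a point about the camera's vertical (y) axis,
    simulating the phone turning left or right."""
    x, y, z = point_xyz
    a = math.radians(angle_deg)
    return (x * math.cos(a) + z * math.sin(a), y,
            -x * math.sin(a) + z * math.cos(a))
```

A street-name label anchored 10 m straight ahead projects to the centre of the screen; yaw the phone a few degrees and re-projecting the rotated point tells you exactly how far the label must slide to stay pinned to the street. Keeping that re-projection accurate under real-world motion and lighting is the hard part.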
Besides getting augmented reality to work correctly, Apple also needs to refine the software development kits (SDKs) that let developers create augmented reality. For example, if you put up a billboard of a man’s face, you might want augmented reality to let people see that face talk and deliver a short sales message of some kind. So not only must augmented reality work correctly with a camera, but it must also be easy and reliable to build for someone who wants to create augmented reality for others to view.
Augmented reality is coming soon because it offers a host of practical and interesting applications. Maybe it will come next year or maybe it will take longer, but it’s coming. The real question is how well Apple can make it work so it’s useful right from the start.