Augment your reality

Lynn Greiner, freelance IT journalist and regular contributor to InsightaaS

Virtual reality (VR) seems to have captured people's imaginations this year. We have Oculus Rift (owned by Facebook), HTC Vive, and other toys to play with. They're cool, though, and can dump you completely into a virtual world with 360-degree views and changing perspectives, as though you're (almost) actually there. VR games let players immerse themselves in combat or fly spaceships, while VR simulators let pilots safely practice flying and surgeons try out new techniques without risking patients' lives. At the recent E3 show, it was reported that a vendor has even created virtual porn (the writer who reported on it was bemused, but didn't seem too taken with the experience).

These wonderful gadgets are expensive, sometimes awkward to use, and require a ton of computing power. But there’s another form of “reality” that is also serving a practical purpose, and it’s nowhere near as pricey. It’s called Augmented Reality (AR).

Unlike VR, it doesn’t divorce you from the real world. Instead, AR overlays the physical environment with additional information, ranging from useful data about what you’re looking at to a superimposed gaming environment.

It’s been around for a while; University of Toronto professor and AR pioneer Steve Mann has been wearing some sort of AR device for, he says, over 30 years. Now the technology is trying to find a home in the real world. In 2012, the Royal Ontario Museum used AR to bring its dinosaur exhibition to life. Visitors could use the iPads in the exhibits, or bring their own Apple devices and download the AR app, to put flesh on the monsters when they pointed the camera at a skeleton, and receive information on how the beast had lived.

A simpler, but no less useful app arrived with Microsoft’s early Windows Phone 7. City Lens (now HERE City Lens) relies on the device’s GPS and other sensors, plus the camera, to overlay information about places in the user’s vicinity. As the user turns, the app senses what the camera is pointing at and updates the content. It requires an Internet connection; the volume of data necessary wouldn’t fit on a phone.
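The actual implementation of City Lens isn't public, but the core geometry of this kind of app — deciding which nearby places fall within the camera's field of view, given the phone's position and compass heading — can be sketched in a few lines. Everything here (the function names, the 60-degree field of view) is an illustrative assumption, not taken from the HERE app.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def visible_places(user_lat, user_lon, heading_deg, places, fov_deg=60.0):
    """Return the names of places whose bearing from the user falls inside
    the camera's horizontal field of view, centred on the compass heading."""
    visible = []
    for name, lat, lon in places:
        b = bearing_deg(user_lat, user_lon, lat, lon)
        # Smallest angular difference between camera heading and place bearing
        diff = abs((b - heading_deg + 180.0) % 360.0 - 180.0)
        if diff <= fov_deg / 2.0:
            visible.append(name)
    return visible
```

As the user turns, the app would re-run this filter with the new compass heading and redraw only the labels that survive — which is why the overlay appears to "stick" to the world rather than to the screen.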

Actually, most augmented reality apps rely on a connection to external servers, on the Internet or elsewhere, for just that reason. The more comprehensive the app, the more data it needs. City Lens, for example, has to know all about entire cities, in detail, and that data resides in the cloud.

These applications are interesting, to be sure, but are not something a business would necessarily rely upon. However, the combination of the Internet of Things (IoT) and AR can create some extremely useful tools. Flowserve, National Instruments, Hewlett Packard Enterprise (HPE), and PTC collaborated on one example, an instrumented pump that uses AR to assist technicians.

It’s a converged IoT system, using HPE’s new Edgeline server onsite to pull in and analyse sensor readings from the pump. PTC’s analytics software runs on the Edgeline, takes those readings, and determines whether anything is going wrong, or is about to go wrong, and if so, notifies technicians. So far, it’s a pretty standard predictive maintenance setup – the system monitors, learns, and can predict component failures or potential malfunctions.
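PTC's analytics are of course far more sophisticated than anything that fits in a column, but the basic monitor-learn-alert loop described above can be sketched as follows. The class name, window size, and tolerance band are illustrative assumptions, not vendor parameters.

```python
from collections import deque

class PumpMonitor:
    """Toy predictive-maintenance check: learns a moving baseline from
    recent sensor readings and flags values that drift outside a
    tolerance band around it."""

    def __init__(self, window=20, tolerance=0.15):
        self.readings = deque(maxlen=window)  # recent readings only
        self.tolerance = tolerance            # allowed fractional deviation

    def check(self, value):
        """Record a reading; return True if it deviates from the baseline."""
        alert = False
        if len(self.readings) == self.readings.maxlen:
            baseline = sum(self.readings) / len(self.readings)
            if abs(value - baseline) > self.tolerance * baseline:
                alert = True
        self.readings.append(value)
        return alert
```

A real system would also model trends (a reading creeping toward a limit is the "about to go wrong" case), but even this sketch shows the shape of the decision the Edgeline box is making before it ever notifies a technician.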

Once a technician comes out to deal with an issue, though, AR kicks in. The tech’s tablet contains an app that communicates with the system – again, pretty standard stuff – but when he points the tablet’s camera at the pump, like magic, he can see critical readings overlaid on the appropriate spots, with malfunctions or out-of-spec readings highlighted.

Now suppose he has to disassemble or replace a component. The AR display can change to show how the pieces come apart (and, more importantly, how to put them together again). Right now, there’s the hassle of juggling the tablet while trying to manage tools and hardware, but the same technology could be used with a head-mounted device that keeps the technician’s hands free to do the work.

This isn’t science fiction; it’s real. HPE demonstrated a working system at its recent conference. And it’s not the only company putting big bucks into AR. GE has been working on the technology for several years, and its Connected Experience Lab is testing it out. The company demonstrated wearables use cases at Augmented World Expo (AWE) in 2014.

There were still challenges, however. GE found that head-mounted displays such as smart glasses, necessary for hands-free viewing, could be distracting, and that the video quality was inadequate: images took too long to render, or were fuzzy, making them impractical for technicians at the time. The best display medium, it concluded, was a smartphone or a tablet.

Osterhout Design Group demonstrated Reticle Android-powered smart glasses

Wearable technology is improving at warp speed, however, and there are viable AR systems on the market today. At this year’s AWE, Osterhout Design Group’s Reticle Android-powered smart glasses won best in show. The best enterprise solution award went to Flex Solutions, which used the Atheer AiR Platform to replace paper picklists and handheld scanners in a warehouse environment: smart glasses show users which items to pick and scan their barcodes, and the associated “paperwork” is completed through a combination of eye tracking and gestures (think Minority Report).

Smart glasses have also been used in retail environments to check shelves against planograms, ensuring products are placed correctly.

In yet another case, one better suited to a tablet, an interior design app lets the user place virtual furniture in a real room captured by the tablet’s camera, then record the approved decor. It’s much easier on the back than moving real furniture. And again, it’s available now.

In fact, applications for AR are limited only by our imaginations. For example, advertisers have experimented with car ads in newspapers that revealed extra detail about the vehicle, and even showed videos and “walkthroughs”, when the user launched an app on a phone or tablet and pointed the camera at the ad. That met with limited success at the time; most people had no idea how to use the app, or even what it was really for. And, let’s face it, those ads mostly came under the category of “because we can”.

However, transplant the same idea into an instruction manual, and the AR would make sense. And imagine an AR version of assembly instructions that could show where the user was going wrong when the camera viewed the actual item – it could prevent expensive and potentially dangerous mistakes.

As AR technology improves, we’ll likely see more places where it can be useful – possibly more so than VR. AR, if used properly, can be a helpful (and relatively inexpensive) addition to a business’s technology arsenal.