A funny thing happened about a month ago. First, Intel announced that it was gutting its R&D budget for wearables and going all-in on Augmented Reality (AR). At almost exactly the same moment, Google (which had been conspicuously silent on the subject for some time) made an announcement singlehandedly resurrecting its Glass project and shifting its focus to enterprise applications, specifically in the manufacturing and healthcare industries. A day later, Lenovo made a big announcement around AR and hologram technology. And finally, a similar note came from Microsoft, which is using onboard Artificial Intelligence (AI) chips to perform real-time object recognition on HoloLens, its AR platform.
What's going on here?
For one thing, we're seeing a moment where the technology itself (localized processing and onboard AI) is driving toward an experience that feels more seamless and less gimmicky to the end user than previous iterations. Another consideration is that the volume and quality of data available to bring a meaningful AR experience to life are starting to normalize. Why would Google focus on enterprise for Glass, and why now? Think about the applications for manufacturing and healthcare. Consider the number of product SKUs you might combine for a custom order (industrial manufacturing), or the volume of diagnosis and vitals data associated with a patient record (and the individual products or therapies tied to a patient's case): parsing all of this data takes time and is error-prone. If we could visualize that data and have onboard AI help parse insights, a healthcare worker could focus on the critical indicators relevant to a patient's case, and not on the aspects of the treatment or visit that don't matter to the situation. This is about context and creating focus through the noise. Gizmodo's article made a good observation:
"Now the Enterprise Edition targets manufacturing workers and medical professionals who can use Glass to reference medical information and record and transcribe notes when they’re with patients. So that means you probably won’t be able to get your hands on a pair of Glass anytime soon—unless your company makes you use it for work." - Gizmodo
A year or two back, everyone from The Verge to The New York Times was writing about very advanced applications for AR technology in the clinical space: using augmentation to help find a patient's veins more easily, or to remotely assist a new mother with the stressful process of breastfeeding. E-learning is another large area with lots of clinical application potential, but again, the hardware was still first-generation, and the data, software, and mass consumer imagination for the tech were still in their infancy.
"If you can't solve a common pain-point with a new tool, no one will use it."
What we know from years of emerging technology trends is that to create wide adoption, the platform must solve a common problem. We see this in enterprise applications we build for clients every single day. If you can't solve a common pain-point with a new tool, no one will use it. So much of the common technology in our homes, whether microwaves, infrared remote controls, or now even drones, came from military or industrial applications where it was purpose-built. This created an environment where manufacturing and production went through many iterations, ultimately raising quality and lowering cost. That is the fertile ground you need for mass-scale consumer applications to take root. So it is no surprise that e-learning and remote nursing support aren't where AR is meeting the most need in healthcare. Take the task of co-mingling complex data sets and building friendly KPIs or assessment levels for healthcare workers: that is a massive adoption opportunity.
The other side of that coin is cost and risk. We hear every day about someone going into the hospital for a treatment and suffering life-threatening complications due to negligence, secondary infection, and the like. Hospital systems are complex. They treat thousands of patients every day and, let's face it, human error happens. If you could put the power of real-time, visualized data in the hands of healthcare workers, the odds of catching risks before they turn into issues go up, and that translates to cost savings for the hospital. From increasingly harsh regulatory penalties around quality of care and readmission rates, to malpractice suits and negative press, the system itself has a lot to gain from arming clinical staff with the ability to see something trending in the wrong direction.
A Wish-list for the Future
If I could make a wish-list for the future, at the top would be a simple timer associated with each patient room that indicates how long the patient has been waiting to be discharged, once that process has been initiated. If you've ever been in the hospital overnight, you'll know exactly what I'm talking about.
Discharge, from the moment it is mentioned to you as the patient to the moment it actually happens, can take anywhere from one to ten hours. Typically, your attending nurse practitioner is just waiting on the primary care physician to review the final chart and add their signature. I dream of a future where the longer someone has been waiting, the larger and more insistent the visual KPI becomes to the passing observer. If that indicator could pop up in someone's periphery as they walk by the room, and better still, if every clinical staff member had the same access to that data as they pass, the opportunity for someone to escalate or intervene increases dramatically.
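The escalating indicator described above boils down to a simple mapping from elapsed wait time to a display severity. Here is a minimal sketch in Python; the threshold values and level names are hypothetical, purely for illustration:

```python
from datetime import datetime, timedelta

# Hypothetical severity thresholds (minutes waited -> display level),
# checked from most to least severe. Real values would be tuned per hospital.
ALERT_LEVELS = [
    (240, "critical"),  # 4+ hours waiting: large, glaring indicator
    (120, "elevated"),  # 2+ hours: prominent indicator
    (30, "watch"),      # 30+ minutes: subtle indicator
    (0, "normal"),
]

def discharge_alert_level(initiated_at: datetime, now: datetime) -> str:
    """Map time elapsed since discharge was initiated to a display severity."""
    minutes_waited = (now - initiated_at).total_seconds() / 60
    for threshold, level in ALERT_LEVELS:
        if minutes_waited >= threshold:
            return level
    return "normal"

start = datetime(2017, 8, 1, 9, 0)
print(discharge_alert_level(start, start + timedelta(hours=5)))  # critical
```

An AR layer would then render the room's indicator at the size and urgency matching the returned level, so anyone glancing at the room sees the same escalation.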
It will be exciting to watch how this technology starts weaving its way into the common threads of our lives. I hope healthcare is where the innovations take hold first. The implications are huge, and we all stand to gain from the next generation of this technology integrating more fluidly into our daily lives.