Instead of robots taking jobs, AR assistants may help humans do their jobs better! Let me share with you below an interesting complimentary article from Axway’s partner Propelics, a firm specializing in enterprise mobile strategy and world-class mobile apps.
Propelics offers the latest in Humanoid Augmented Reality Assistants in our new Kickstart: Enterprise Contextual AR Assistance Strategy. Equip your employees with the very latest in Humanoid Augmented Reality technology, all for the low low cost of ten million dollars for this fixed-price, two-week engagement.
No, not really. But soon. Or you can wait until 2022 when Google will be giving away Humanoid Augmented Reality Friends for free: customize skin tone and regional accent or choose from a selection of famous actors and politicians.
No, not really. Not yet, anyway…
Much in the way that Clippit (aka “Clippy,” Microsoft’s universally despised ’90s paper clip assistant) once annoyed us with its irrepressible helpfulness, as soon as augmented reality catches on and we’re all wearing AR-enabled devices, expect to see the introduction of virtual, lifelike, three-dimensional digital assistants that (who) interact with us on a regular basis and act as our escorts to the digital realm.
At first we may find these helpers to be just as annoying as the original paper clip assistant. But unlike ol’ Clippy, these assistants may actually prove helpful. Before too long we may even grow to like them—dare I say even depend on them—just as much as we depend on our smartphones today. Because beyond knowing everything about the world, the big differentiator is that these creepy beings will already know everything about us. They’ll know your preferences, they’ll know your history (shopping and otherwise), they’ll know your schedule, and they’ll know your current and past locations. They’ll likely even be able to predict where you’re going.
In short, they’ll know what you like, where you’ve been and where you’re going. Chances are they’ll also know how you’re feeling. Depending on the array of sensors we choose to adorn ourselves with, they may also be aware of a whole host of our physical conditions (body temp, heart rate, blood O2, blood sugar, etc.).
So in about five years, expect to engage your digital AR assistant in predictive interactions like this: “I see you’re taking someone to see Fast & Furious 27 tonight. I can tell by your heart rate you’re excited. Is this a date? Would you also like me to reserve a table at a nearby restaurant? I know how much you love Mexican. And hey, I just found a Groupon from a place not too far from the theater. Want me to buy it for you? Or maybe you’d like something fancier? Oh, and should I go ahead and schedule an Uber for the ride there and back? That way no one has to be the designated driver, if you know what I’m saying.”
Don’t think so? Well, let me ask you this: five years ago, did you think you’d be interacting verbally with an intelligent digital assistant every day? Engaging in conversation with Siri, Alexa, and whatever that Google Home lady is called? That all happened real fast, didn’t it? For the record, the first-generation Echo became widely available in the US on June 23, 2015.
“But Steve,” you ask, “Why would I need a person to act as an intermediary between me and my dinner plans when a mobile app could already accomplish all of the above?” Simple. Because somebody’s got to help us figure out all this augmented reality stuff, and a virtual human guide is the most natural and convenient way of going about it. Why do you think they already have hologram-like projections of virtual assistants at airports? Not just because they’re cool or attention-grabbing, but because interacting with people is what we’re all used to doing. For that matter, why do you think Microsoft thought they needed to add a face to a paper clip?
It’d be weird if, from day one, we were inundated with notifications and informational popups on our AR devices. But if an attractive digital humanoid were doing the talking? I’m pretty sure we’d all be more inclined to interact. Talk about user engagement!
In a way, this phase will represent another manifestation of the early skeuomorphic period of mobile UI/UX design. Remember way back in 2010 when everything on your phone looked like it was made from real-world materials? Metal, wood, paper, etc. That was to create familiarity with the interface, easing the transition to a handheld digital device by helping users understand the intent of all those buttons and switches. Today, the same buttons are simple, flat rectangles. Similarly (and perhaps ironically) I predict the first generation of humanoid assistants will be the most graphically lifelike. Later generations will streamline the presentation until all that’s left is a disembodied voice and some floating arrows.
The question is, are we all going to walk around looking like we’re talking to ourselves all the time? Yes. We are. How can I declare this with such confidence? Because we already do, thanks to those tiny Bluetooth earbuds (though to be fair, nobody actually talks on their smartphone anymore).
I originally addressed this idea three years ago, here, but with the widespread adoption of AR, I can really see this thing coming to fruition.
But there’s more. In addition to having our own virtual AR assistants, we can also expect to see corporate-sponsored people (real ones, or bots indistinguishable from virtual renderings of them) appearing in our AR field of view. Need help in a real physical space—say, when shopping in a store, walking into a bank, or enjoying a museum? Virtual assistants—provided by the place of business—will be there to help. Eventually, I expect these to become as commonplace as live help chat is now on most commercial websites.
And from a technical perspective, a 3D AR avatar is not far off. In fact, it’s already here. Check out this TED talk about the Microsoft HoloLens featuring a “holographic projection” in which a dude standing in a studio across the street is ‘beamed’ virtually onto the TED talk stage while—for good measure—also standing on a digital reproduction of the surface of Mars (as captured by Rover). Granted, the technology demonstrated is neither holographic nor a projection, but the basic premise is there: live interaction with a 3D representation of a real person who is physically located somewhere else.
Or watch this totally adorable “holoportation” demo that further explains how interacting with virtual 3D representations of remote people could be helpful, not only because it lets us hang out with our kids more often but because these ‘holograms’ (totally not holograms) will also be able to interact with our environment “as if they’re co-present.” Wondering how to use that new TV? The Panasonic AR Assistant will teach you how it works by actually pointing at things on your TV, assuming you have the proper “capture rig” (camera setup) in your home environment, which I’m assuming we’ll all have pretty soon. But even without the capture rig, a virtual 3D AR assistant could literally point out the most popular item on the menu (in the restaurant), or point the way to the nearest Starbucks and then literally walk us there, ordering our favorite drink along the way so it’s ready and waiting when we arrive. Humanoid AR assistants could teach us how to dance, instruct us in yoga, even provide real-time therapy by encouraging us to face our phobias or push our workouts harder. The possibilities are endless.
So get ready. Once Augmented Reality achieves universal adoption (or something close to it), everything’s gonna change. At this point it’s just a matter of waiting until somebody finally figures out how to make a pair of AR glasses that Geordi La Forge wouldn’t be embarrassed to wear. In the meantime, if you’d like to help your company start enjoying the many benefits of pushing the technological envelope, give us a call. Our Emerging Technologies Kickstart may be just the boost your business has been looking for.