Will smart glasses fog when I drink my morning coffee?
So many questions about the future of smart glasses remain unanswered.
As we slouch toward the end of the smartphone era, it’s important to consider what comes next — and plan accordingly.
Here are the questions technology professionals like us should be asking, with the answers we know so far (many of which have emerged in the past week).
Q: Will Apple make smart glasses?
I told you in January why Apple needs to make smart glasses in order to stay on top. In a few years, smart glasses will be at the center of consumer and business electronics in the same way that smartphones are today.
Apple CEO Tim Cook is on record as saying that smart glasses could be as transformative as smartphones were. The company holds many smart-glasses patents and has hundreds of engineers working on the technology.
Bloomberg this week reported that Apple has been working on a smart-glasses project under the internal name "T288." The glasses would run an iOS-derived operating system called rOS (more on that below). The article said Apple could ship its first smart-glasses product within three years. (That’s optimistic. Five years is more realistic, in my guesstimation.)
Apple’s project is working toward stand-alone glasses, which don’t require a smartphone as the "engine" and screen, according to the report. (The article doesn’t specify whether the glasses will also offer stand-alone cellular connectivity, as the Apple Watch Series 3 with GPS + Cellular does.)
Q: Will Apple make smart glasses for the enterprise?
Apple tends to make all-purpose computers and devices that serve both consumer and enterprise use cases, so its smart glasses will likely find integrators, developers and customers in the enterprise. Apple would probably target common use cases and leave specialization to smaller companies.
Q: Are people going to wear smart glasses while walking around in public, or will they be obtrusive "goggles" or Google Glass-like devices that aren’t socially acceptable?
The answer is: yes.
Every imaginable variant of smart glasses will become available. These will shake out into two general categories — those that look like ordinary glasses and are worn all the time, and obtrusive "goggles" that you put on for active use.
The hardest problem in smart glasses is making them look like regular glasses. In fact, ordinary-looking glasses that offer augmented reality are at least 10 years away.
Q: When will smart glasses arrive?
Smart glasses have been around for years. They’re here now. For example, a search on Amazon.com for "smart glasses" returns 64 results.
None of these consumer products is ready for super-wide adoption. But they do exist.
Enterprise smart glasses are more advanced. Bulky, obtrusive smart glasses are also already on the market and new ones are coming out all the time. A few products have come out just this week.
DAQRI this week began shipping its DAQRI Smart Glasses product, both directly and through channel partners. The product is aimed at manufacturing, field services, maintenance and repair, inspections, construction and other applications.
Olympus this week introduced its EyeTrek Insight EI-10, open-source smart glasses for the enterprise. The glasses have a swappable battery system, which partly compensates for a battery life of only around an hour.
Also this week, NEC introduced a new platform for smart glasses development based on its ARmKeypad technology.
These new products join the long list of existing or in-development enterprise smart-glasses products from companies such as Microsoft, IBM, Sony, Vuzix, Meta, Epson, Toshiba, Lumus, MicroVision, Penny, Brother, Konica Minolta, Fujitsu, Optinvent, Augmented Vision, Atheer, GlassUp, Telepathy, Laster, Innovega, Trivisio, Baidu, LAFORGE Optical and many others.
Q: Does Google Glass have a future?
Google is ahead of the market in lightweight, heads-up display glasses. Companies such as Boeing have been using the original version of Google Glass for manufacturing.
The new and greatly improved Enterprise Edition shipped last month.
And a new Google patent emerged this week showing a version of Google Glass with screens embedded directly in both lenses (no more "boom" pouring light into one eye). Unlike the current model, which offers only heads-up display capability, the design described in the patent could bring Glass into the augmented reality space.
Q: What will the primary purpose of smart glasses be?
The purpose of smart glasses will be to place computer-generated content in the user’s field of view, ears or both.
This content includes augmented reality, mixed reality, virtual reality, 360 video, heads-up displays and contextual audio.
Instead of looking at a rectangular screen, we’ll see words and pictures and objects and virtual environments by simply looking around.
As I mentioned previously in this space, all these use cases can blur together into similar experiences — or will be experienced à la carte, as the app or use case (or device) demands.
(It’s telling that Apple calls its smart glasses platform rOS, which almost certainly stands for "Reality Operating System." Apple appears to be rolling up all the stuff that smart glasses can do into the single word "reality.")
The important point is that smart glasses won’t be one thing; there will be many kinds of smart glasses delivering many kinds of experiences.
Q: What will the first mainstream, everyday smart glasses be like?
While bulky enterprise devices are already coming online, we can also look forward to smart glasses we wear every day all day. In other words, "glasses" — prescription glasses, reading glasses and sunglasses — will gain smart options.
"Smart frames" will be offered for sale at the optometrist’s office.
I believe the first big product in this space will come from Amazon.
Amazon has led the industry in churning out appliances that deliver its Alexa virtual assistant. Glasses that let you speak to Alexa and get answers through bone conduction, or through a combination of bone conduction and tiny speakers, are an obvious next step for Amazon, and it holds a patent for exactly that.
It’s also telling that Babak Parviz, the founder of Google Glass, now works for Amazon.
The Financial Times reported that Amazon is in fact working on such a product, or line of products, which could ship within the next two months. If that doesn’t happen this year, I believe Alexa glasses will ship next year at the latest.
Amazon may lead a pack of companies offering minimalist audio-only or audio-plus-blinking-light-notification-type smart glasses next year. Over time, ordinary, everyday smart glasses will gain ever more sophisticated capabilities.
Q: How are smartphones preparing us for the smart glasses era?
The most interesting smartphone on the market at present is Apple’s iPhone X. The phone takes a huge step toward the transition from smartphones to smart glasses.
The iPhone X comes out of the box with two telling Apple-created features: Animoji and Apple Clips 2.0. These are fun, apparently frivolous distractions for consumers. But for Apple, they represent a preview of the company’s smart glasses future.
Animoji are cartoon-character avatars that move and make facial expressions that mimic the user’s in real time. Apple Clips 2.0 enables users to change the background during selfie videos.
But viewed through the lens of the coming smart glasses revolution, what’s really going on with these "trivial apps"?
The purpose of smart glasses is to combine the real with the virtual, and both these apps do exactly that. Animoji take the user’s real voice, head movements and facial expressions, and apply them to a virtual self — a cartoon character.
Apple Clips 2.0 places the real person in a virtual background — or places a digitally modified real person into a virtual background. (Check out numerous uploaded example clips on Twitter.)
These apps use the special hardware in the iPhone X (which I talked about here and here) to accurately track the user — more specifically, finely measuring the distance of every point on the face from the phone in real time. The technology behind this feat is tiny and fits into the array of cameras and sensors in the "notch" of the iPhone X.
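To see why per-pixel depth matters, consider how an effect like Clips’ background replacement could work. Apple’s actual pipeline is proprietary, but a minimal sketch of the underlying idea — assuming only a camera frame and a matching depth map — is just a threshold: pixels measured as near belong to the user, and everything farther away gets swapped for the virtual scene.

```python
import numpy as np

def replace_background(image, depth, background, threshold_m=1.0):
    """Composite a virtual background behind a nearby subject.

    image:      H x W x 3 array of pixel colors (the camera frame)
    depth:      H x W array of per-pixel distances in meters
    background: H x W x 3 array (the virtual scene)

    Pixels measured closer than threshold_m are kept as the real
    subject; everything farther away is replaced with the backdrop.
    """
    foreground_mask = depth < threshold_m  # True where the subject is
    # Broadcast the H x W mask across the 3 color channels.
    return np.where(foreground_mask[..., None], image, background)

# Toy example: a 2x2 "frame" whose left column is near (0.5 m)
# and whose right column is far (3.0 m).
image = np.array([[[255, 0, 0], [0, 255, 0]],
                  [[255, 0, 0], [0, 255, 0]]], dtype=np.uint8)
depth = np.array([[0.5, 3.0],
                  [0.5, 3.0]])
background = np.zeros_like(image)  # a black virtual backdrop

result = replace_background(image, depth, background)
# The left (near) column keeps the real pixels; the right (far)
# column is replaced by the backdrop.
```

Real products refine the mask with machine-learned segmentation and edge feathering, but the depth threshold is the core trick that accurate distance measurement makes possible.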
A future version of this technology will no doubt show up in Apple’s smart glasses, but facing outward away from the user in order to map the real world for real-time augmented or mixed reality applications.
Apple’s push for augmented reality on the iPhone is clearly a precursor to the main event: augmented reality on smart glasses.
The company is working hard to train a generation of augmented reality app developers on its ARKit and related developer tools and resources. (ARKit was the first product out of Apple’s T288 smart glasses group.)
Google is doing something similar with Project Tango, albeit with a fraction of the developers.
Q: And finally: Will smart glasses fog when I drink my morning coffee?
Nobody knows yet. But the most important thing to know is that smart glasses are already here. They’ll improve to the point of becoming mainstream in both everyday life and the enterprise. And the direction for smart glasses is already being set in leading-edge smartphones like Apple’s iPhone X.