It wouldn’t be Christmas without tedious reruns, would it? And this week, it seems the rerun in question is “wouldn’t it be great if we all looked like cyborgs with transparent screens in front of our eyes?”
Never mind that people bold enough to wear smart glasses in the mid 2010s were christened “glassholes” for their troubles. Just six short years after Google Glass was discontinued for the public, a few companies are reportedly set to try the whole sorry experiment again, ready to create a whole new generation of social pariahs with more money than sense.
First up is Oppo which, along with its pop-out camera tech, debuted something called Oppo Air Glass at its Inno Day event. It's a coffee-bean-sized micro projector capable of flashing up grayscale 2D images onto the lenses of your Oppo-designed frames. It's coming early next year, though currently only to mainland China.
Then there's talk that Google itself is planning an ill-advised second go, proving that not everyone once bitten is twice shy. The report comes thanks to a job listing for a senior software engineer, with one chilling line: “As part of the team, you will be responsible for overall camera device software for an innovative AR device.” Not definitive, but potentially pointing in that direction again.
And then there’s Apple. While next year’s AR headset is likely to be more Microsoft HoloLens than Google Glass, the long-term aim is reportedly for AR glasses to replace the iPhone. Which will make it seem slightly less rude if people decide to check their emails while chatting to you, I suppose.
Perhaps it would be a bit unfair to demand a killer use case for products that don't exist yet, two of which may never materialize. But it does seem odd to me that the best I've heard anybody offer so far is to provide a kind of HUD for your eyes, so notifications can pop up in your peripheral vision: a whole new level of laziness for people who find lifting their smartwatch just a bit too onerous to be practical. Although those same people are going to be gutted when they find out how often AR glasses will need charging.
But in all seriousness, it's a strange answer for an industry prone to hand-wringing guilt about digital addiction. In the last few years, both Apple and Google have introduced tools for monitoring screen time, to try to wean people off being permanently tethered to their screens. I'm not sure physically tethering them to another screen is quite in the spirit of this intervention, even if it does technically make them consult their phones a little less.
There's a real pleasure in being disconnected from your devices for a few hours. A screen-break walk, assuming you block notifications to your step-counting Fitbit for the duration, is a great way of clearing your head before returning to work. I'm not sure it'd have the same effect if you had a screen in front of your eyes, telling you the weather, where to walk next to avoid traffic and that you just got a Slack message from your boss.
But putting aside all of these quality-of-life concerns, there’s also a very real practical reason why attaching another screen to our peripheral vision is a bad idea: humans are very easily distracted, and it would be surprising if this didn’t end up killing someone.
Unclear but present danger
I appreciate that sounds alarmist, but let me show my working. A couple of years ago, I interviewed Dr Gustav Kuhn, who has the dual distinction of being both a reader in psychology at Goldsmiths, University of London and a practicing magician.
His main area of interest is how magic tricks fool the brain, and while there are a lot of different techniques involved, it essentially boils down to taking advantage of our sensory shortcuts and the fact that we're nowhere near as observant as we think we are, as demonstrated in the famous experiment where 50% of participants didn't notice when the person they were talking to changed halfway through a conversation.
Over the course of a talk I attended, Kuhn proved that it was possible for humans to see things that aren't there, and to not see things which are when their attention is focused elsewhere. If you fancy testing yourself, take a break from reading and follow the steps in the short video below.
The reason I arranged to speak to Kuhn was a specific part of his talk on the science of magic, where he said the following: “Is it a good idea to develop human interfaces that allow you to present information onto glasses while you’re interacting with the world? No, it really isn’t.
“It might make intuitive sense because people can keep an eye on the task, but doing so will distract them and they simply won’t be able to see it. Most importantly, they’re not aware that they won’t be able to see it.”
So concerned was Kuhn about this that he actually delivered a version of his talk to Google staffers. You can watch it here, if you like, but the point is neatly demonstrated by the trick finishing at 03:52, which I still can't work out despite intently and repeatedly watching the spot where something must be happening.
“I was interested because Google Glass has been a bit of a bugbear of mine, because it’s just such a bad idea,” Kuhn told me later. “It’s so terrible, it’s really terrifying and I was quite keen to go and give a talk to Google to just highlight some of these limitations.” They were, it turns out, “as oblivious to these limitations as the general public.”
That, for me, is reason enough to never want AR glasses on my face, but thankfully the general look may be a far better deterrent for the fashion-conscious.
That is, of course, unless Apple manages to get people to override their instincts of what looks normal. If a company can make people voluntarily stick AirPods in their ears, then getting them to wear novelty glasses should be a piece of cake.