Visions of the Future: Virtual vs Augmented Reality

 


Batman vs Superman. Coke vs Pepsi. Red Sox vs Yankees. Boxers vs briefs. It’s no surprise that we love a good rivalry, dear reader. As long as it remains friendly, a good brouhaha stirs up the impassioned devotees on both sides and provides bloggers with an endless parade of what-if scenarios and listicles with which to assail their readers.

There’s a fight brewing within the technology sector that makes us old-timers misty-eyed for the heady days of VHS vs Betamax. A time when pants were parachutes, hair was big and floppy disks actually were floppy. Good times they were.

The fight I’m referring to is between virtual and augmented reality. In the next five years we’re likely to hear little else spewing from the mouths of the propeller-heads and boffins of the technological cognoscenti. Both are powerful technologies with a potentially huge range of applications, but, like the Highlander, there can be only one. Indeed, I’ve read the tea leaves, rolled the bones and consulted the Magic 8-Ball, and I’m prepared to share my not-so-expert prognostication exclusively with you, my dear readers. But first, as they say in boxing, let’s take a look at the tale of the tape.

Virtual Reality

In the blue corner, we have virtual reality, or VR. An older contender to be sure, at least in concept, VR replicates an environment, simulating a physical presence in a real or even imagined world and allowing interaction through artificially created sensory experiences: sight, touch, hearing, even smell.

The concept itself can be traced all the way back to 1938, when Antonin Artaud described the illusory nature of characters and objects in the theatre as “la réalité virtuelle” (virtual reality) in his collection of essays Le Théâtre et son double. While science fiction has played with the idea since the 1950s, it wasn’t until 1968 that Ivan Sutherland, with the help of his student Bob Sproull, created what is widely considered to be the first virtual reality head-mounted display system. The primitive headpiece worn by the user was so heavy that it had to be suspended from the ceiling, inspiring its name: ‘The Sword of Damocles.’ The current crop of headsets, most notably the Oculus Rift, represents the cutting edge of consumer-ready VR technology. With the huge gains in graphics and computing power over the past few years, VR is poised to be a powerful technology – but for what, exactly?

Augmented Reality

In the red corner comes the sprightly form of augmented reality, or AR. Instead of replicating an environment, AR takes the real-world environment around you and enhances it with interactive sensory input such as sound, video, graphics or GPS data.

AR is a newer concept in terms of its technology and, as such, lacks VR’s pedigree. In many ways it’s an offshoot of VR, taking a less all-in approach. Imagine looking through a pair of goggles at your couch and seeing an animated character sitting there, and you have some idea of how augmented reality works. One of the most intriguing products taking advantage of AR is the HoloLens, an exciting, if somewhat out-of-left-field, product that seems to be part of a concerted effort by Microsoft to get its mojo back. Like VR, this is powerful technology, but will its lighter touch and more grounded interface give it the traction it needs to survive?

There Can Be Only One (Sort Of)

Unlike the format wars of the past, I honestly don’t think there will be a single winner, with the loser fading into ignominy. I think the choice of VR vs AR will, as with so much of technology, come down to the use case.

Virtual reality seems to be a gift-wrapped present for the gaming and entertainment industries. A fully immersive world (which VR certainly delivers) has been a fever dream of the industry for as long as there’s been an industry. Being able to control every single aspect of the environment will offer nearly infinite options to game creators, filmmakers and visual effects artists. Although VR has been touted as an amazing learning and educational device, that side of the technology hasn’t yet been embraced as widely as many had hoped. The fact that it IS an entirely immersive experience means that it’s a very individualistic technology: not only can no one else see what you’re seeing, but you cannot really interact with the real-world environment while wearing the headset.

Augmented reality is also an incredible gift for the gaming community – in fact, HoloLens was apparently conceived purely as a gaming device until Microsoft’s incoming CEO Satya Nadella encouraged the team to expand it into a broader user interface system – but it delivers that gift in an entirely different way. Imagine a game that uses your real environment (the headset is mostly transparent), with characters hiding behind your bookcase or diving through doorways. The mind reels at the possibilities, especially as the technology improves. The real selling point for AR, however, is in the educational realm. One of the examples in Microsoft’s demo video showed someone fixing a bathroom sink while wearing the HoloLens. As he looked through the goggles at the pipe, an instructional video played on a small screen to one side of his field of view, and an arrow appeared showing him which way to turn the wrench to tighten the joint. It’s like annotating the real world. The possibilities for enhancing education and training are simply staggering.

My Not-So-Expert Prognostication

If I had to put my money on one of these technologies gaining broad public acceptance, I’d have to go with augmented reality. With apologies to Mark Zuckerberg, I don’t see the benefit of everyone being tied into an opaque headset, totally oblivious to the world around them and unable to move without ‘interacting’ with the environment in a way that could involve bruises. Mobile phones and texting have already created that scenario, and that’s bad enough. While you’ll probably never walk around town wearing an AR headset either, the ability to access and interact with information in a way that doesn’t cut you off from the rest of the world makes it definitely the more practical of the two technologies.

For me personally, VR is an entertainment device that reminds me too much of 3D, a technology that still doesn’t really work as well as it should and as such remains a novelty. AR feels more like a tool – one used in very specific circumstances to be sure – but a valuable and practical tool for education and yes, even entertainment.

Which one will be dominant? Only the future knows for sure. In the meantime, place your bets…

A Eulogy for IE7

Friends, we are gathered here to pay respects to our honoured dead. Allow me to say a few words about our dear, departed Internet Explorer 7… The year was 2007. We were still reeling from the twin bombshells of Apple’s introduction of the iPhone and the news that Bob Barker was leaving The Price is Right. The televisions of the world were abuzz with the news that the Spice Girls were planning a reunion. Sadly, no one was watching television, because they were busy tweeting about a Facebook status announcing Bob Barker’s departure on their iPhones.

It was during this social tempest that Microsoft rolled out Internet Explorer 7.0, the first major update to its browser in more than five years. Bundled with the infant (and some would say deeply flawed) Windows Vista, it quickly became a major player on the burgeoning World Wide Web. By late 2008 it had reached a worldwide market share of nearly 47%, an impressive number for any browser. Despite this, IE7 suffered from a distinct lack of support for emerging web standards – an Achilles’ heel that would persist even into later versions. (If we’re being honest, pairing it with Vista didn’t help.)

Fast forward to 2012. Digital firms and developers across the world begin dropping support for the venerable IE7. As happens any time a piece of software shuffles off this digital coil to stream to the great server stack in the sky, the friends and family are left struggling to answer the question: why?

Web browsers are every user’s portal into the Internet. Without a browser, people would be supremely bored staring at an endless wall of raw code – unless you could see into the Matrix like Neo. (And then you’d rock a very cool coat.) Web browsers – including Internet Explorer, Safari, Firefox, Google Chrome and more – are an essential part of today’s digital visual experience. While every browser is unique, a universal truth is that updating these pieces of software to the latest versions will give you a better viewing and navigation experience on your website.

Internet Explorer 7.0 is a six-year-old browser, and in that time many new web standards have been implemented (and a few abandoned) to improve web users’ experience and safety online – standards that IE7 simply does not support. This means that websites have to be designed to comply with modern standards AND with IE7’s outdated code. This creates much more work for web developers – which translates into increased development costs. (And very cranky developers.) Add to this the bombshell that, as of August 1, 2011, Google no longer supports IE7. I’ll say that again. Google no longer supports IE7. Take a moment to think about the implications of that statement. It’s OK – I’ll wait.
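While you wait, here’s a small taste of what ‘supporting IE7’ actually meant day to day. The sketch below (in TypeScript; the helper name is my own invention) shows the kind of wrapper developers had to write for something as basic as listening for a click, because IE7 shipped Microsoft’s proprietary attachEvent instead of the standard addEventListener, which didn’t arrive in Internet Explorer until version 9:

```typescript
// A cross-browser event helper of the sort nearly every site carried.
// Standards-compliant browsers expose addEventListener; IE7 (and IE8)
// only offer Microsoft's proprietary attachEvent, which also insists
// on an "on" prefix for the event name.
function addListener(
  el: EventTarget,
  type: string,
  handler: (event: Event) => void
): void {
  if (el.addEventListener) {
    el.addEventListener(type, handler, false); // the standard way
  } else if ((el as any).attachEvent) {
    (el as any).attachEvent("on" + type, handler); // the IE7 way
  }
}

// One listener, two code paths - multiplied across every event,
// every CSS quirk and every rendering bug on every site.
addListener(document, "click", () => console.log("clicked"));
```

Multiply that duplication across the entire web and you begin to see why Google drew a line.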

In a prepared statement, Google was quoted as saying:

“As the world moves more to the web, these new browsers are more than just a modern convenience, they are a necessity for what the future holds.”

With a global search engine market share of 80%, these are words to be heeded.

And so you see friends, modern browsers are an essential tool in today’s online world, completely free, and updated more often than Lady Gaga’s Twitter feed. While IE7 was useful in his day, that day has passed, and we must soldier on. It’s what he would have wanted.

“Ashes to ashes, dust to dust, they’re on version 10 now, update it we must.”

Fanboys: Does Anybody Care?

Picture the scene: You’re in an electronics store with a friend, just shopping around and checking out the newest toys. Suddenly, a specific piece of digital bling catches your eye, so you sidle over to take a look. You try it out carefully, weighing the price against its usefulness, trying to decide how it fits your lifestyle or workflow, and you nod thoughtfully as you come to the conclusion that this may be the best choice. And then you hear it: “You aren’t gonna buy that, are you?”

You turn to find your friend staring at you with surprise. At your questioning look, he explains patiently that consumer surveys have indicated that XYZ Corp. has had difficulty with customer service, and that its devices are not as cost-effective as other comparable brands. He also explains that the ABC Corp. device he is holding is much better suited to your particular workflow and budget. You listen carefully, then smile and thank him for the help, placing the XYZ device back on the shelf and reaching for the ABC Corp. device as an infinite chorus of heavenly angels sings in a beam of unearthly light.

Suffice it to say that if you believe this scenario, you are a fool. (Particularly the last bit.) You and I both know that the scenario ACTUALLY goes something like this:

You turn to see your friend eyeing the XYZ Corp. widget you hold as if it were the severed head of a succubus. At your questioning look, your friend rolls his eyes and proceeds to lecture you on how XYZ Corp. makes terrible, kitten-destroying devices that are responsible for every catastrophe since the beginning of time, how all its users are illiterate, barely human Jerry Springer rejects, and how the CEO is actually the illegitimate love child of the demon Cthulhu and Jeffrey Dahmer. Exasperated at being lectured, you fire back with detailed explanations of why the people of XYZ Corp. make Gandhi look like a terrorist, and how in one study an XYZ Corp. device actually cured a woman in Illinois of cancer. A dead woman. In 1924. Your friend responds with…

And so on. Every day we see these ridiculous arguments between tech enthusiasts.

“Apple is amazing!” “Windows is designed for stupid people!” “Apple Macs are tools of the devil!” “Android is an operating system for toys!” “Windows is OBVIOUSLY a superior operating system!”

I’ve truly never understood this brand loyalty. People act as if, once they purchase a computer or other electronic device from a specific manufacturer, they are locked into that brand for the rest of their natural lives and must go out and preach to the unwashed masses to save their very souls. These people are labelled ‘fanboys’, sparking even more arguments and trash-talking, to the point that most average folks find the whole thing rather silly.

I am a tech enthusiast. I love the stuff and make no apology for it. However (and this is a key point), I hold no loyalty to any single technology maker. I use whatever device fits into my workflow best. When I end up being lectured by some twit with a massive (micro)chip on his shoulder, I have the same response every time:

“I don’t care what you use.”

If you love your Apple MacBook Pro, or your Windows ultrabook, or your Android phone, then fine! Wonderful! I don’t even mind hearing about how great it is, because I like to see people happy with their technology choices. What I object to is being lectured because I think Windows is too unreliable for my workflow, or because I believe the Android OS is not robust enough for my purposes. I am not saying these products are bad, just that they are not right for me. That last phrase is the one fanboys conveniently overlook during their sermons.

I can say one thing: the term is definitely appropriate. Fanboys. This is the behaviour of a child, except that a child is excused because they have not yet learned the social graces. The adults who engage in this behaviour simply look ridiculous, and never truly convince anyone.

So the next time you are in an electronics store, look around. You’ll see several arguments going on, and you’ll see me. I’ll be the one walking away shaking my head, MacBook Pro in hand.