Blog: Augmented Reality Fieldview 2020 – Part I

Context

What is the current state of the human endeavour in digitally augmenting our world? Moreover, in considering where we are today, how have we arrived at this point, and where will our efforts lead? To understand the augmented reality (AR) ecosystem in 2020, we must rewind a decade and project half as far into the future. 

Before we dive into the state of AR in 2020, we must first have a mutual understanding of what AR is, and equally importantly, what it is not. Herein, we will define these terms as follows:

Virtual Reality (VR): Using traditional displays and rendered environments to simulate other worlds

Augmented Reality (AR): Using transparent displays to overlay additional layers on the real world

Extended/Cross Reality (XR): Using a synthesis of either or both approaches

In this discussion, we will break the development of AR into three distinct epochs. Each of these phases is marked by distinct features, and they are additive over time. While a series of small and medium-sized players ebb and flow, throughout it all a handful of whales have directed the entire ecosystem in profound ways, the culmination of which we will begin to experience in the new decade.

The three eras of AR growth and proliferation can be described much like the trajectory of an individual startup, extrapolated to an industry scale. In the fledgling seeding stage (2010-2019), the incumbents surveyed the field, made their initial bets, and formulated long-term strategies; simultaneously, a handful of unknown and untested ventures arose of their own accord. Once the initial field was cemented, the second phase of development (2020-2022) could begin, wherein the existing players cross-pollinate through the transfer of personnel, the flow of intellectual property, and the acquisition or merger of smaller entities. In the ultimate stage (2023-2025), we will finally reap the benefits of synthesizing our analog and digital worlds as these structures come to fruition, brought on by a final charge of rapid Pac-Manning of smaller corporations and commercial deployment to large swaths of the domestic, and then global, population.

Era I – Seeding 

Titans

In the earliest stages of seeding the augmented reality (AR) industry, the major existing technology companies slowly turned their attention to the possibility of a new form of computing. While most of the world was focused on the rise and promise of virtual reality (VR), a minority of experts were already laying the groundwork for a decades-long struggle for dominance in the smartphone's usurper, AR. Here, we can look at the work Google, Apple, Microsoft, and Facebook were doing at the start of the twenty-teens to understand the market activity of today.

The first glint of AR in the public eye came in early 2013, when Google unleashed Glass to its Glass Explorers cohort. While it wouldn't be available to the public until 2014, this first group of users caused a major uproar as society came to grips with the notion of being recorded by people with cameras constantly attached to their faces. While the concerns seem quaint by today's standards, and the smartphone reigns supreme as the video recorder of choice, this was still a clear inflection point wherein society stated firmly that mass adoption was a long way off. Meanwhile, the actual display on Google Glass occupied a markedly minuscule part of the human visual field and was by no means immersive. Immersion, typically measured by field of view (FOV) and embellished by other extensions such as volumetric sound, is a central question in any AR experience, and even today it remains one of the industry's white whales.

One key facet of AR is important to pin down: the project of true AR is about augmenting the individual user just as much as it is about augmenting their environment. Because it cuts both ways, the sensors of an AR device are just as important as the display technology. This is particularly clear for the likes of Apple and Microsoft, who have spent billions of dollars cultivating, developing, and acquiring the proper sensor suite. This is also what makes the first (2013) and second (2019) generation Google Glass more like smart glasses than true augmented reality devices; they fail to close the loop by turning their attention back to the user, serving instead as only a display and some external-facing sensors.

Apple’s position on AR is the most nuanced and reserved of all the major players. While the start of the last decade was filled with rumors of Project Titan, Apple’s self-driving car effort, and a 2013 bid to acquire Tesla, the meat of its internal AR efforts slipped under the tech press’s radar until fairly recently. As early as 2015, Apple’s internal XR R&D group, the Technology Development Group (TDG), had already turned its attention to commercializing a product. Led by Mike Rockwell, the TDG grew to over 1,000 engineers and went on to produce two revolutionary prototypes, N301 and N421.

These two devices took entirely different forms: N301 was a VR or XR device with ultra-high resolution and speed, while N421 was a standalone AR glasses offering. The unfortunate pitfall of N301 was that it was so powerful it required a separate, wirelessly connected hub for computation. This ultimately drew a major critique from Jony Ive, who “balked at the prospect of selling a headset that would require a separate, stationary device for full functionality.” At Ive’s behest, Apple scrapped N301 in favor of N421’s model of standalone, lightweight AR in a form factor similar to common reading glasses or sunglasses.

While Ive was concerned with the specific implementation of AR at Apple, Tim Cook has spoken about the technology in remarkably high-level and clear-cut terms as far back as 2016. In an interview on Good Morning America, Cook stated that between VR and AR, his “view is that augmented reality is the larger of the two, probably by far.” He went further and decried VR for the way it isolates, “encloses, and immerses” its users. On the flip side, he lauded AR because it “amplifies human performance, instead of isolating us.” For Cook, the next stage of computing “should be a help for humanity, not an isolation kind of thing for humanity.”

In 2017, Apple made a significant move toward realizing its vision of true AR by purchasing the German eye tracking firm SensoMotoric Instruments (SMI). SMI specialized in highly accurate, precise, reliable, and robust mobile eye tracking systems, which made it a perfect fit for integration with the N421 project. SMI is also SyncThink’s current provider of eye tracking on our GearVR-based system, which is what allows us to provide medical-grade eye tracking, whereas most mobile eye tracking on the market today is consumer grade.

SyncThink’s EYE-SYNC system with eye tracking provided by SMI

The very same year, Apple took its most daring public step into the world of AR by releasing ARKit at WWDC 2017. Targeted at the top-of-the-line iPhones, the kit gave developers tools for motion tracking, horizontal plane detection, and anchoring digital content in physical space. It also played off the Pokemon Go mania that had gripped the world since the previous summer and riffed on the popularization of Snapchat’s filters (enabled by its acquisition of Ukrainian AR startup Looksery for $150M in 2015). ARKit has been expanded in subsequent years and continues to inform the addition of new sensors on the top-tier iPhone and iPad Pro.

Importantly, ARKit is a sleeper, and the genius move by Apple here has been threefold. First, by releasing its AR software development kit far ahead of the actual hardware, Apple has oriented developers in that direction, gotten them comfortable thinking in blends of analog and digital worlds, and familiarized them with the language they will use for development. Simultaneously, it has made its consumer base familiar with the technology at a notional level, building an appetite for the future product, something Google did not do with its sudden release of Google Glass in 2013. Finally, it will be fairly simple for Apple to port AR experiences from the iPhone or iPad to its iGlasses, meaning that plenty of applications will exist at launch, driving consumers’ willingness to jump headfirst into the deep end.
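To give a sense of how low that developer on-ramp is, the sketch below shows roughly what a minimal ARKit app looks like: start a world-tracking session, detect horizontal planes, and drop a small virtual cube wherever the user taps. This is an illustrative example only, not Apple sample code or code from any project discussed here; the class and method names (MinimalARViewController, handleTap) are hypothetical, and a real app would also handle permissions, session errors, and lighting.

```swift
import ARKit
import SceneKit
import UIKit

// A minimal, hypothetical view controller: starts an ARKit world-tracking
// session, detects horizontal planes, and places a 5 cm cube on the first
// surface the user taps.
class MinimalARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)

        // Place content where a tap intersects a detected plane.
        let tap = UITapGestureRecognizer(target: self, action: #selector(handleTap(_:)))
        sceneView.addGestureRecognizer(tap)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // World tracking fuses camera imagery with motion data so virtual
        // content stays anchored as the device moves through the room.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)

        // Hit-test the tap against planes ARKit has already found.
        guard let result = sceneView.hitTest(point, types: .existingPlaneUsingExtent).first else { return }

        // Anchor a small cube at the real-world position of the tap.
        let cube = SCNNode(geometry: SCNBox(width: 0.05, height: 0.05,
                                            length: 0.05, chamferRadius: 0.005))
        cube.simdTransform = result.worldTransform
        sceneView.scene.rootNode.addChildNode(cube)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```

A few dozen lines like these are enough to blend digital objects into a live camera view, which is exactly the kind of low-friction familiarity Apple has been cultivating in its developer base ahead of any headset release.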

Concurrent to Google’s foray into smart glasses and Apple’s quiet internal development and software plays, Microsoft was deep in the weeds doing the hard work of producing a fully fledged AR system from scratch. In line with the notion that AR is just as much about sensors as displays, Microsoft’s Project Baraboo was spun out of its Kinect division around 2010 with Alex Kipman at the helm. Microsoft worked quickly, and of all the tech giants it is the only one with a commercialized AR product on the market, thanks to HoloLens, announced in 2015. Not only that, but the line has just received a major update in the form of HoloLens 2, which was announced last year and released in February of 2020. This has put Microsoft leagues ahead in terms of real-world deployments and the integration of real-world feedback.

HoloLens is not targeted at a consumer audience, as it is fair to assume Apple’s N421 will be. Rather, HoloLens and HoloLens 2 are meant for frontline workers in scenarios that require additional information, precision, or collaboration. Importantly, HoloLens operates in a fully standalone capacity. It does not have to be connected to a powerful desktop or laptop computer, as many VR headsets still require. It does not have an external power supply or computation unit. And it does not even have a controller; instead, it relies on input from voice, hand tracking, gestures, and eye tracking. This makes it a formidable offering with very few compromises, aided by an extensive field of view that minimizes scene clipping.

In juxtaposition to Apple, Microsoft, and Google, it was clear early on that Facebook would be a major player in the XR space, as signaled to the world by its 2014 acquisition of Palmer Luckey’s Oculus for $2B. While at the time this move was the most aggressive public push into XR by any of the major players, it was a strictly VR play. Oculus itself had little to no IP that was purely applicable to AR; instead, it had revolutionized the combination of lenses and classical displays needed to achieve smooth, non-vomit-inducing VR experiences. Since the acquisition, Facebook has continued to place most of its eggs in the VR basket, and what attention it has put into AR comes in the form of passthrough cameras on the front of an Oculus HMD, not the more classical AR form factor of transparent lenses made of a metamaterial.

Mortals

While the tech giants on the west coast began to settle in for the long haul, across the country other efforts commenced that would call into question the seemingly unassailable power of the incumbent players. In 2010, in the Florida Everglades, Rony Abovitz incorporated his comic book company, Magic Leap, with a dream of creating a richer world full of deep universes, the Magicverse.

2014 marked a significant turning point for the outsider play from Magic Leap. The company had previously raised $50M from investors, but in October of 2014 it announced the closing of its $540M Series B. The question of “who is arming the rebels” in the AR space is an interesting one. The general pool of investors was made up of top-tier VC firms like A16Z and Kleiner Perkins, but there are also some noteworthy members of the cap table, specifically Vulcan Capital (Paul Allen’s VC firm) and Google. Coming on the heels of the Google Glass launch, this second step forward marked a clear trajectory into the AR multiverse for the search giant. The Series B position Google took also came with a board observer seat for then-SVP of Android, Chrome, and Apps, Sundar Pichai.

Magic Leap’s funding history is a record-setting one. Following the $540M Series B, the company went on to set the record for largest Series C, with a total investment of $790M from repeat backers like Google and Qualcomm and new investors like Alibaba, T. Rowe Price, and Warner Media. Since that record-setting Series C, Magic Leap has raised another half dozen rounds of investment for a staggering total of roughly $3B.

More important than the funding is what it produced. All of this capital flowed into spinning up large-scale metamaterial production that relied heavily on the high-precision manufacturing standards used by chip makers. These metamaterials were housed in a goggle-like enclosure fed by a cable from a hip-worn pack that provides power and computation. All of this was bundled together with a controller and, after nearly a decade of development, released in the summer of 2018 as the Magic Leap One Creator Edition. Because of the inherent cost of the metamaterial and the associated hardware to run it, Magic Leap’s first product was priced at $2,300 USD. That price meant Magic Leap could afford to integrate far more robust sensors than anything on the VR market, which generally had to stick to sub-$1,000 price points to satisfy consumers. While this has meant very strong point-cloud mapping of the environment and a solid eye tracking offering, the public was not nearly as willing to shell out for a $2,300 HMD as the company had hoped. Against projections of selling 100,000+ headsets in the first few years, Magic Leap had shipped only around 6,000 units by the end of 2019.

Magic Leap has a secret sauce that no other AR company has yet been able to match: the depth of the dream. Magic Leap isn’t just about creating beautiful hardware; it is also intimately involved in crafting the universe that hardware provides access to. This can be seen in some of the earliest moves the company made, such as partnering with Weta Workshop on Dr. Grordbort’s Invaders or with the NBA to stream games through the headset. Indeed, the partnership with AT&T and the focus on 5G bring to the forefront the need for high-bandwidth data transfer. Finally, the medical attention given to the device has been remarkable, with a burgeoning ecosystem of creators like SyncThink, BrainLab, and XR Health at the forefront of brain health assessment, augmented neurosurgery, and patient health, respectively.

Outside of Magic Leap is an ever-growing smattering of smaller AR companies, which often flicker in and out of existence like the holograms they create. Further, the term AR has been loosened and adopted by technologies that are not true offerings in the space. Many smart glasses companies now claim to be in the AR space, but in reality they are more comparable to a heads-up display (HUD) than anything augmented or immersive.

Not to be confused with North (formerly Thalmic Labs), Project North Star is a nascent movement more than it is a corporation. Launched in 2018 by Leap Motion (now Ultraleap), the creators of the hand tracking hardware and software of the same name, Project North Star offers creators the ability to build their own AR HMD with the help of standardized designs, community collaboration, open-source GitHub software repositories, and build guides. The notion is to “accelerate experimentation and discussion around what augmented reality can be” by “open sourcing the design and putting it into the hands of the hacker community.” The price point to build a North Star HMD is very low, as the project relies on “off-the-shelf components and 3D-printed parts.” This shatters centralized access to the world of AR by allowing any individual to enter the metaverse with what they have at their disposal, without relying on a corporation to provide the keys and the gateway.

The proliferation of ideas, approaches, hardware, and software over the last ten years brings us to a clear inflection point. Today the AR field has grown to such a size that there are traceable flows of people, intellectual property, and corporations across the various enterprises and efforts. 2020 is the year of the eye for many reasons, not the least of which is the inception of the second era of AR: cross-pollination.