Consensual reality

The data model of augmented reality is likely to be a series of layers, some of which we consent to share with others.

[Image: augmented reality data layers]

A couple of days ago, I had a walking meeting with Frederic Guarino to discuss virtual and augmented reality, and how it might change the entertainment industry.

At one point, we started discussing interfaces — would people bring their own headsets to a public performance? Would retinal projection or heads-up displays win?

One of the things we discussed was projections and holograms. Lighting the physical world with projected content is the easiest way to create an interactive, augmented experience: there’s no gear to wear, for starters. But will it work?

This stuff has been on my mind a lot lately. I’m headed to Augmented World Expo this week, and had a chance to interview Ori Inbar, the founder of the event, in preparation.

Among other things, we discussed what Inbar calls his three rules for augmented reality design:

  • The content you see must emerge from the real world and relate to it.
  • It should not distract you from the real world; it must add to it.
  • Don’t use it when you don’t need it: if a film is better on the TV, watch the TV.

To understand the potential of augmented reality more fully, we need to look at the notion of consensual realities.

We’re on the cusp of an era in which each of us perceives the world around us differently because of technology. One might argue that we’re already there — even with friends, half our thoughts are in our smartphones, in chat, in maps, on Facebook. But it’s going to get much more obvious when we start augmenting our senses.

Imagine that, during our walking meeting, Guarino and I had been wearing augmented reality devices that projected heads-up displays into our eyes. As we walked, we’d connect with at least four kinds of information:

  • Personal, private data (such as a reminder to call a loved one).
  • Shared data (such as notes and hyperlinks about what we’d discussed on our walk).
  • Public opt-in data (such as an advertisement from a liquor store as we walked past).
  • Public, unavoidable data (such as a red warning when you accidentally step into oncoming traffic).
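
To make that data model concrete, here’s a minimal sketch of how such layers might be represented in code. Every name in it (the consent levels, the fields, the function) is hypothetical; no real AR platform is being described. The point is simply that each layer carries its own consent rule, and visibility is computed per viewer.

```typescript
// A sketch of the layered AR data model described above.
// All names are invented for illustration; this is not a real AR API.

type ConsentLevel =
  | "private"       // personal data, visible only to its owner
  | "shared"        // visible to explicitly invited peers
  | "publicOptIn"   // public data a viewer must subscribe to, like ads
  | "unavoidable";  // public-interest data that cannot be dismissed

interface ARLayer {
  id: string;
  consent: ConsentLevel;
  audience: Set<string>; // viewer IDs; ignored for "unavoidable" layers
  anchor: { lat: number; lon: number }; // tie content to the physical world
  content: string; // what gets drawn into the viewer's display
}

// Two people standing side by side call this with the same layers
// and get different worlds back.
function visibleLayers(
  layers: ARLayer[],
  viewerId: string,
  subscriptions: Set<string> // layer IDs this viewer has opted into
): ARLayer[] {
  return layers.filter((layer) => {
    switch (layer.consent) {
      case "private":
      case "shared":
        return layer.audience.has(viewerId);
      case "publicOptIn":
        return subscriptions.has(layer.id);
      case "unavoidable":
        return true; // traffic warnings, emergency alerts
    }
  });
}
```

Nothing about this sketch is settled; what matters is the filter, which renders the same physical street differently for every passerby depending on their consent set.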

[Image: a Google Map annotated with personal data]

Each of these is a set of contextual information layered atop our perception. Even when sharing the same layer (such as a map) with someone else, there will be significant variation. Today, for example, Google scrapes your inbox for flight and hotel reservations, then displays them in your calendar and maps — so my version of a map layer might have annotations about a hotel on it.

Personal data will require tremendous context. The best personal agent won’t just remind me that I need batteries when I’m at the hardware store; it will also know when not to interrupt me because I’m concentrating on something else. Some of this data will come from paid software — your colleague may not be able to afford the experience you’re having.
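
As a toy illustration of that kind of context rule (with invented names throughout, not any real agent API): the reminder fires on location, but a focus signal can veto it.

```typescript
// Hypothetical sketch of a context-aware reminder, as described above.

interface Reminder {
  text: string;         // "buy batteries"
  triggerPlace: string; // "hardware store"
}

interface Context {
  currentPlace: string;
  isFocused: boolean; // e.g. inferred from calendar, motion, or conversation
}

function shouldSurface(reminder: Reminder, ctx: Context): boolean {
  // Right place, and the viewer isn't concentrating on something else.
  return ctx.currentPlace === reminder.triggerPlace && !ctx.isFocused;
}
```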

Shared data means sharing applications and handling permissions. Collaboration tools will be a hotbed of innovation in AR software, but issues like version control and attaching content to physical locations aren’t yet well resolved.

Public opt-in data will face governance and regulation. Alcohol ads shouldn’t be shown to children who pass by, and filtering viewers by age, gender, religion, and so on is fraught enough that this will likely follow an opt-in model rather than an opt-out one. Either way, AR spam will be a real problem.
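
As a thought experiment, a gate for that liquor-store ad might combine a hard regulatory rule with a consent default. Everything below is hypothetical; it describes no real ad platform or regulation, just the shape of the logic.

```typescript
// Hypothetical sketch: gating a public opt-in ad on viewer consent,
// plus a regulatory rule the advertiser cannot override.

interface AdLayer {
  id: string;
  category: "alcohol" | "gambling" | "general";
  minimumAge?: number; // jurisdiction-imposed floor, e.g. 18 or 21
}

interface ViewerProfile {
  age: number;
  optedInCategories: Set<string>; // ad categories the viewer chose to see
}

function mayShowAd(ad: AdLayer, viewer: ViewerProfile): boolean {
  // Hard regulatory gate first: no opt-in can lower an age floor.
  if (ad.minimumAge !== undefined && viewer.age < ad.minimumAge) {
    return false;
  }
  // Then the consent gate: the default is "do not show".
  return viewer.optedInCategories.has(ad.category);
}
```

Defaulting to “do not show” is what makes this opt-in rather than opt-out; AR spam, in this model, is whatever finds a way around that default.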

Finally, there will be some data that’s unavoidable. Your mobile device has to support 911 calls regardless of phone plan; emergency warning systems like Amber Alerts can push messages to your smartphone’s screen whether you like it or not. Data that’s in the public interest is one thing, but parents and guardians may impose oversight software on their charges. Imagine what happens when a headset warns you against binge drinking.

Navigating a world where everyone else has a slightly different view of reality will be jarring, too: one person running lie-detector software, another exhibiting perfect social recall.

Of course, being in a space is a form of consent, and for those environments, holograms and projections work well. But the data model of augmented reality is likely to be a series of layers, some of which we consent to share, temporarily, with others.

Maybe the distraction of our handsets is just training us for such a world.

