User interfaces for AR

People intuitively understand that Augmented Reality (AR) opens the door to compelling new ways to interact with technology and our environment. Yet the AR implemented on mobile phones today (e.g., tapping a point of interest on the live video shown on the phone’s screen to get more information) is only the tip of the iceberg. It’s risky to predict too far into the future, and talk of haptic (touch) interfaces and heads-up displays for AR may sound like just hype. But new ways of interacting with digital data overlaid on the real world are not hype to those who work on them.

Researchers at the Max Planck Institute for Biological Cybernetics in Tuebingen and the Swiss Federal Institute of Technology in Zurich (ETHZ) are members of a worldwide community conducting user studies to evaluate how touch and sight work together in multimodal interfaces for AR applications. These researchers are making great strides toward “next generation” user interfaces, inventing devices that take advantage of increasingly powerful and sensitive sensors (one and then two cameras, Assisted GPS, 3D magnetometers, and 3D accelerometers).

Touch

In most consumer mobile AR applications released to date, AR interaction is mediated via touch: the user taps a highlighted area of the screen to request more information associated with an object or point on the planet.

But we perceive our environment using a combination of all our senses, and researchers are developing haptic interfaces that go beyond the “touch for more info” model. Dr. Matthias Harders of the ETHZ Computer Vision Lab works on applications of AR for training surgeons. At the upcoming ISMAR 09 meeting, Drs. Benjamin Knörlein (ETHZ), Massimiliano Di Luca (Max Planck), and Harders will present the results of their ongoing research on the use of haptics in AR interfaces. Their studies show that delaying haptic feedback decreases the user’s perception of “stiffness” in the interface, while delaying the visual response (so what the user sees lags what the finger feels) increases perceived stiffness. Understanding how vision and touch interact to shape the user’s perception can help AR developers fine-tune an interface so that the stiffness a user feels matches the stiffness intended.
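
To make that finding concrete, here is a minimal sketch (my own toy model, not the researchers’ method) of how delay can change apparent stiffness: we press into a virtual spring and estimate “felt” stiffness as the best-fit slope of the rendered force against the seen indentation during loading. All constants and delay values are illustrative assumptions.

```python
# A toy model of stiffness perception under delay, not the ISMAR authors'
# actual experimental analysis. All numbers are illustrative assumptions.
import numpy as np

K_TRUE = 300.0   # true spring stiffness (N/m), assumed
A = 0.02         # press depth: 2 cm
T_PRESS = 0.5    # time to reach full depth (s)

def perceived_stiffness(haptic_delay=0.0, visual_delay=0.0, fs=1000.0):
    t = np.arange(0.0, T_PRESS, 1.0 / fs)
    w = np.pi / (2.0 * T_PRESS)                # quarter-sine loading ramp
    indent = lambda tt: A * np.sin(w * np.clip(tt, 0.0, None))
    force = K_TRUE * indent(t - haptic_delay)  # force lags true indentation
    seen = indent(t - visual_delay)            # display lags true indentation
    # Best-fit slope (through the origin) of force vs. seen indentation:
    return float(np.dot(force, seen) / np.dot(seen, seen))

print(perceived_stiffness())                    # ~300: baseline
print(perceived_stiffness(haptic_delay=0.05))   # < 300: feels softer
print(perceived_stiffness(visual_delay=0.05))   # > 300: feels stiffer
```

In this sketch, a lagging force means less force for a given seen indentation (softer), while a lagging display means the same force appears with less indentation (stiffer), which mirrors the direction of the reported results.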

Such studies may one day help in designing AR training for neurosurgeons, so please pay attention!

Show Me

Today, the information associated with a Point of Interest (POI) in a consumer AR application is frequently presented on the small screen as text in a box or bubble, or as arrows for navigation. These are easy to understand and “good enough,” given the limited accuracy and speed of today’s sensors. They work fine for a tourist trying to locate a subway stop, but would not be suitable for a utility crew registering the precise position of an underground electrical cable before digging.
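
For the curious, here is a minimal sketch of the kind of arithmetic behind those bubbles and arrows, assuming a simple GPS-plus-compass browser (the function name, field of view, and screen width are my illustrative choices, not any particular product’s values). It also shows why sensor accuracy matters: a 5-degree compass error shifts a label by roughly 40 pixels on a 480-pixel-wide screen.

```python
# A minimal sketch of placing a geo-referenced POI on the screen from
# GPS position and compass heading. Parameters are illustrative only.
import math

def poi_screen_x(user_lat, user_lon, heading_deg, poi_lat, poi_lon,
                 h_fov_deg=60.0, screen_w_px=480):
    # Local east/north offsets in metres (flat-earth approximation)
    east = math.radians(poi_lon - user_lon) * 6371000 * math.cos(math.radians(user_lat))
    north = math.radians(poi_lat - user_lat) * 6371000
    bearing = math.degrees(math.atan2(east, north)) % 360
    # Angle of the POI relative to where the camera points
    off = (bearing - heading_deg + 180) % 360 - 180
    if abs(off) > h_fov_deg / 2:
        return None                      # outside the camera's field of view
    return screen_w_px / 2 + off / h_fov_deg * screen_w_px

print(poi_screen_x(47.37, 8.54, 0.0, 47.372, 8.54))  # dead ahead -> 240.0
print(poi_screen_x(47.37, 8.54, 5.0, 47.372, 8.54))  # 5 deg heading error -> 200.0
```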

One drawback of overlaying text or diagrams on live video on a mobile phone screen is that the user must hold the device in the correct position, pointed at the target. Head-Mounted Displays (HMDs) offer a hands-free, heads-up alternative. For at least a decade, researchers in dozens of laboratories and companies have worked on developing lighter and more reliable HMD technologies. Many research systems, professional training systems, and military applications (e.g., night-vision goggles) already use HMDs.

While popular consumer applications that use HMDs are likely more than three years in the future, costs are falling, performance is rising, and more applications for HMDs exist today than many realize. For example, the Academy of Art and Design at the University of Applied Sciences Northwestern Switzerland has developed a system called LifeClipper2, which uses a heads-up display to let the user experience the changes proposed for urban renewal or development projects. LifeClipper can help city officials and other stakeholders more fully understand the impact of those projects before they’re actually built.

Many mobile phone AR applications require the user to hold up a mobile handset for several minutes, and that’s tiring. But avoiding that fatigue isn’t the only compelling reason to explore the future of HMDs. [Note: Thanks to V-VM of Nokia Research Center for reminding me that there may be camera phones designed with the lens and screen positioned in such a way that the user would not need to hold up their arm to view the world at eye level. But then the user would be looking down, which might also be awkward when walking.]

HMDs have three other noteworthy advantages. First, in bright light, when a mobile phone screen may be difficult to see, these displays remain easily visible to the user. The heads-up display can even double as the user’s sunglasses. Don’t you think I look chic in these shades?

[Photo: Christine Perey wearing last year’s model of the Nokia gaze tracking system. Photo credit: Toni Järvenpää]

Second, when the display rests on the user’s ears and nose, as eyewear, the hands are free to do something else. If, however, the display is near the eye, how the user selects a point of interest becomes more problematic. Scientists at the Nokia Research Center in Finland have been working on using eye gaze to point. The device worn in the photo above tracks gaze direction and head position to direct user interactions.
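
One common gaze-selection technique in the research literature is dwell time: the system treats a sufficiently long, steady gaze at a point of interest as a “click.” Here is a minimal sketch of that idea; it is a generic illustration, not necessarily how Nokia’s prototype works, and all names and thresholds are my assumptions.

```python
# A generic dwell-time gaze selector, not Nokia's actual algorithm.
DWELL_S = 0.8        # seconds of steady gaze that count as a selection
WINDOW_DEG = 2.0     # angular tolerance around a POI

class DwellSelector:
    def __init__(self):
        self.target = None
        self.since = 0.0

    def update(self, gaze_deg, poi_deg_by_id, t):
        """gaze_deg: current gaze angle; poi_deg_by_id: {poi_id: angle}."""
        hit = next((pid for pid, d in poi_deg_by_id.items()
                    if abs(d - gaze_deg) <= WINDOW_DEG), None)
        if hit != self.target:
            self.target, self.since = hit, t   # gaze moved: restart the clock
        return hit if hit is not None and t - self.since >= DWELL_S else None

sel = DwellSelector()
pois = {"cafe": 10.0, "tram": -25.0}
for t in [0.0, 0.3, 0.6, 0.9]:          # steady gaze near the cafe
    print(t, sel.update(9.5, pois, t))  # selects "cafe" once t >= 0.8
```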

Finally, although the field of view is still much narrower than what many AR and VR applications require, an HMD can provide a more “immersive” user experience than a small handheld screen at arm’s length. HMDs also frequently have integrated earpieces to augment the visual with sound. Using sound (for example, 3D audio) and synthesized or natural speech to present information to the user is yet another area of exploration that is surely underway in research labs around the world.

To appreciate the full potential of the new user interaction paradigms being developed for and studied with AR, a test drive is valuable. At ISMAR 09 in Orlando, October 20-22, 2009, Vuzix, a manufacturer of a wide range of HMDs, will be among several companies exhibiting their latest models. As part of the research demonstrations, Nokia Research will show the current prototype of its gaze tracking system, using hardware much lighter than that used a year earlier (shown above). A YouTube video showing how the new model might work in the future can be watched here. Microvision, another manufacturer of commercial HMDs and pico-projectors, will participate in one of the ISMAR Workshops on October 19, 2009.
