Smell and taste: New frontiers for experience design

Designers need not start from scratch as they wrestle with orchestrating experiences that span digital and physical.


Download a free copy of Designing for the Internet of Things, a curated collection of chapters from the O’Reilly Design library. This post is an excerpt from Understanding Industrial Design, by Simon King and Kuen Chang, one of the books included in the curated collection.

Two of our richest senses, smell and taste, are not often associated with design. However, the creation of objects that support these senses is an ancient practice, best embodied by the tea set, where rituals of assembly and service offer the first hints of the aroma. Holding the tea cup warms your hand without burning it, and the slow sipping of the tea forms a communal bond with other participants. Beyond classic and common serving items, designers today are increasingly finding new ways to collaborate with chefs and food companies to design with smell and taste in mind, forging a new frontier for sensorial design.

Martin Kastner is the founder and principal of Crucial Detail, a studio in Chicago that specializes in custom pieces to support unique culinary experiences. Martin is best known for his work designing serviceware concepts for Alinea, the 3-star Michelin restaurant founded by chef Grant Achatz. That collaboration has extended to other restaurants owned by Achatz, including The Aviary, a cocktail bar that prides itself on serving drinks with the same level of attention as a fine dinner.

At The Aviary, one of the most popular creations by Crucial Detail is the Porthole Infuser, a round vessel that presents the ingredients of a patron’s cocktail between two flat panes of glass, emphasizing the transformative action of the steeping process and building anticipation for the cocktail’s taste. The Porthole Infuser takes a part of the preparation process that is normally hidden and brings it directly to the patron’s table, giving the drinker time to contemplate the ingredients on display and to form a mental checklist for the tongue to seek out on that first sip.

The popularity of the Porthole Infuser at The Aviary led Kastner to create a Kickstarter campaign to fund the additional design and manufacturing required to release it as a commercial product. Support for the project was dramatic, raising 25 times the original funding goal. This backing set the course for a redesign that allowed the infuser to be manufactured at scale and sold for $100, down from the several hundred dollars that each custom-constructed version for The Aviary cost.

The Porthole Infuser is marketed as more than a cocktail tool, working equally well to support the smell and taste of oils, teas, or any other infusion recipe. It’s an example of how designers can enhance the dining experience, not by crafting the smell or taste of the food itself, but by working in collaboration with a chef to heighten our awareness of those senses.

Much of what we eat today comes in a package, rectangular boxes that homogenize our food into the same shapes and textures without regard to its smell or taste. Japanese designer Naoto Fukasawa explored how food packaging could more fully engage our senses in his “Haptic” Juice Skin submission to the 2004 Takeo Paper Show.

Fukasawa created various juice boxes, each with a covering and structure that invokes the skin of the relevant fruit. The banana milk package has the rubbery texture of a real banana skin, along with faceted edges and the ubiquitous oval sticker on the side. The strawberry juice box is square in shape, but richly textured using real seeds. The kiwifruit juice box, as you might expect, is brown and fuzzy to emulate the unique feeling of that fruit’s natural skin.

In simulating the color and texture of the fruit’s skin, Fukasawa hoped to reproduce the feeling of the real skin, invoking a more holistic sensory moment as the juice was consumed. Although designed as a concept for an exhibition, the banana packaging was actually produced commercially for a limited time by the TaKaRa company. The production run looked quite similar to the exhibition version, but unfortunately without the simulated texture.

How might interaction designers support smell and taste? This is truly new and under-explored territory, but there are signs of interest, and one-off experiments are happening that point toward a potential role. One of the most engaging speakers at the IxDA Interaction 2014 conference in Amsterdam was Bernard Lahousse, who gave a talk entitled “Food = Interaction.” Lahousse, who has a bio-engineering background, works at the intersection of food and science to truly design for taste itself. He founded The Foodpairing Company, which provides an online tool and API for chefs, mixologists, and foodies to explore and be inspired by potential food combinations through a science-based recommendation engine.
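The core idea behind such an engine, greatly simplified, is that ingredients sharing key aroma compounds tend to pair well. Here is a toy sketch of that idea in Python; the compound sets are invented placeholders, and this is not Foodpairing’s actual algorithm or API:

```python
# Toy sketch of aroma-based pairing: ingredients that share more key
# aroma compounds score higher as suggested combinations. The compound
# sets below are invented placeholders, not real chemistry.
AROMA_PROFILES = {
    "strawberry": {"furaneol", "linalool", "hexenal"},
    "basil": {"linalool", "eugenol", "estragole"},
    "dark chocolate": {"furaneol", "linalool", "pyrazine"},
    "cucumber": {"hexenal", "nonadienal"},
}

def suggest_pairings(ingredient: str, limit: int = 3):
    """Rank other ingredients by how many aroma compounds they share."""
    target = AROMA_PROFILES[ingredient]
    scored = [(len(target & profile), name)
              for name, profile in AROMA_PROFILES.items() if name != ingredient]
    scored.sort(reverse=True)
    return [name for score, name in scored[:limit] if score > 0]

print(suggest_pairings("strawberry"))
# -> ['dark chocolate', 'cucumber', 'basil']
```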

In his presentation at IxDA Interaction, Lahousse shared that it’s not only the flavor pairings themselves that contribute to smell and taste; the environment and the manner in which we eat can also have a dramatic effect. The design of packaging and utensils is one part of this, but he also gave examples of chefs who are creating interactive, even game-like, eating experiences. One restaurant he highlighted uses room temperature, sound, and projections to design an environment that alters and enhances the smell and taste of the food. These augmented dining environments are one area in which interaction designers could contribute their expertise to support the full range of human senses.

An orchestration of the senses

Interaction designers have always tried to engage people’s senses, but in comparison to the tangible output of industrial design, the options to do so have historically been limited. When designing for the screen, the best option has often been simulation, using metaphor and connotation to invoke a sensorial experience beyond what can truly be offered.

The introduction of the graphical user interface (GUI) was the first major advancement in engaging the senses through a screen. The next leap forward was the “multimedia” era, bringing sound, motion, and interactivity together in unique and immersive environments. Multimedia was initially made possible through cheap CD-ROM storage, which offered access to large graphics and video files that were impractical to store on small hard drives or download over slow Internet connections.

Interaction designers of the multimedia era often utilized the new capabilities of CD-ROMs to break away from standard interface conventions and mimic as many sensorial, real-world elements as possible. Map interfaces looked like faded and stained treasure maps, deep drop-shadows created virtual depth, and richly textured environments launched users into immersive 3D worlds. This was a time of widely variable interface experimentation, as designers combined text, graphics, audio, video, and animation in unique ways to make encyclopedias, video games, and educational programs that simply weren’t practical before CD-ROMs.

The invocation of physical materials and properties also found its way into standard programs and operating systems. Apple first introduced a brushed-metal interface style with QuickTime 4.0, which later became a dominant feature of their OS X operating system. By 2004, Apple had canonized brushed metal in their Human Interface Guidelines (HIG), encouraging designers to use the visual treatment if their program “strives to recreate a familiar physical device — Calculator or DVD player, for example.” This visual reference to a physical material was less sensorial than metaphorical, acting as a bridge to ostensibly enhance usability and understanding as behaviors transitioned from physical to digital devices. This was the same rationale employed for the early versions of Apple’s iOS and, over time, both operating systems evolved to use simpler UI styles once users became familiar with the platforms.

Referencing physical materials through a visual treatment obviously cannot engage our senses in the same way as their physical counterparts. Graphics that look like leather, felt, steel, or linen are often little more than interface decoration. The sensorial limitations of these graphic treatments highlight the distinction between interface and interaction design. Static pixels on a screen can only engage us visually, and in most instances should avoid invoking additional senses they can’t deliver on. But interaction design goes beyond the interface to encompass all the moments of interaction that a person has with a system over time.


This is why interaction designers tend to think of their work in terms of “flows,” focusing as much or more on the connections between states as on the states themselves: the various inputs and outputs that are possible at any given moment. This focus on the in-between makes time itself a kind of design material. It is not so much that interaction designers are manipulating a user’s sense of time, though elements like progress bars do sometimes try to ease waiting, but that they are using this fourth dimension as a connective platform to combine information, choices, and responses. Time is a kind of stage from which to orchestrate sensorial engagement into a set of dynamic movements.
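To make that concrete, a flow can be modeled as a small set of states connected by transitions, with each transition triggered by an input and the time spent in each state available to the designer. This is a minimal sketch; the states and events are hypothetical, not drawn from any particular product:

```python
import time

# A flow as states plus transitions: each (state, event) pair maps to a
# next state. The names here are hypothetical illustrations.
TRANSITIONS = {
    ("idle", "user_approaches"): "ready",
    ("ready", "press_play"): "playing",
    ("playing", "press_pause"): "paused",
    ("paused", "press_play"): "playing",
    ("playing", "timeout"): "idle",
}

class Flow:
    def __init__(self, initial="idle"):
        self.state = initial
        self.entered_at = time.monotonic()

    def handle(self, event):
        """Advance the flow if the event is meaningful in the current state."""
        next_state = TRANSITIONS.get((self.state, event))
        if next_state is None:
            return  # this input does nothing here; the flow is unchanged
        dwell = time.monotonic() - self.entered_at  # time in a state is itself material
        print(f"{self.state} --{event}--> {next_state} (after {dwell:.1f}s)")
        self.state = next_state
        self.entered_at = time.monotonic()

flow = Flow()
for event in ["user_approaches", "press_play", "press_pause", "press_play"]:
    flow.handle(event)
```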

On a computer or mobile device, this orchestration of interaction possibilities and system feedback can utilize animation, translucency, figure/ground relationships, color, sound, and standardized notifications to facilitate engagement with the system over time. But how does this work when we move beyond the screen? When a physical product is embedded with computation and network connectivity, it transforms from an object into a system. A traditional product has discrete and predictable interactions that take place within a defined session, but once it becomes a system, the sequence of interactions is less predictable and plays out over a longer period of time.

Consider the previously discussed Beosystem 2500, where the opening and closing of the stereo’s doors represent three clear states that form a beginning, middle, and end to the experience. Compare that to the range of possible states and behaviors that a connected, computationally controlled stereo might have. Beyond reacting to your raised hand, it could detect your presence in the room as a specific individual. It could respond to your gestures or voice, highlight or hide relevant modes based on nearby media or subscription status, allow for use of remote speakers, adapt the volume based on time of day, offer you new music by your favorite bands, or start music playing just before you enter the house. The possibilities are almost limitless.
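As a thought experiment only, and not a description of any actual Bang & Olufsen behavior, those responses might be sketched as rules that turn sensed context into a single action:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical context the connected stereo might sense or infer.
@dataclass
class Context:
    person: Optional[str]   # who the stereo believes is nearby, if anyone
    hour: int               # local hour of day, 0-23
    approaching_home: bool  # e.g., inferred from a phone's location

# Invented per-person preferences; a real system would learn these.
FAVORITES = {"ada": "a new release by a favorite band"}

def decide(ctx: Context) -> str:
    """Turn sensed context into one stereo behavior; purely illustrative."""
    if ctx.approaching_home and ctx.person:
        return f"start music quietly before {ctx.person} enters the house"
    if ctx.person is None:
        return "stay idle: no one is present"
    volume = "low" if ctx.hour >= 22 or ctx.hour < 7 else "normal"
    suggestion = FAVORITES.get(ctx.person, "something familiar")
    return f"greet {ctx.person}, offer {suggestion}, at {volume} volume"

print(decide(Context(person="ada", hour=23, approaching_home=False)))
# -> greet ada, offer a new release by a favorite band, at low volume
```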

How should this hypothetical stereo enable this expanded set of interaction possibilities? One approach is to put the majority of interactions on a screen, a tablet on a stand in the living room. However, David Rose of the MIT Media Lab argues that the next era of computing is more likely to be full of what he calls “enchanted objects,” where interactions with our products and environment are more natural, more physical, and less reliant on a glowing rectangle to control everything.

As physical products become increasingly integrated with digital systems, interaction designers should avoid defaulting to a screen for everything. Computational sensors can be used as richer and more natural inputs, detecting and making inferences from changes in light, temperature, motion, location, proximity, and touch. Output can move beyond a screen with voice feedback, haptic actuators, light arrays, and projection.
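A minimal sketch of that principle, with invented sensor events and stand-in actuators, routes each sensed change to a physical output channel rather than a display:

```python
# Each function below is a stand-in for real actuator code (speech
# synthesis, a haptic driver, an LED array); the event names are invented.
def speak(message): print(f"[voice] {message}")
def pulse_haptic(pattern): print(f"[haptic] {pattern}")
def set_lights(level): print(f"[light array] {level}")

RESPONSES = {
    "proximity:person_near": lambda: set_lights("ramp up gently"),
    "motion:none_for_30_min": lambda: set_lights("fade to off"),
    "touch:long_press": lambda: pulse_haptic("double pulse to acknowledge"),
    "location:arrived_home": lambda: speak("Welcome home"),
}

def on_sensor_event(event: str):
    """Dispatch a sensed change to a non-screen response."""
    handler = RESPONSES.get(event)
    if handler is not None:
        handler()

on_sensor_event("proximity:person_near")  # -> [light array] ramp up gently
on_sensor_event("location:arrived_home")  # -> [voice] Welcome home
```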

In utilizing this mix of inputs and outputs, designers should not assume that screen-based interaction patterns can always be translated directly into the physical environment. Getting a notification on your phone might be unobtrusive, but having it spoken aloud in your living room might be less desirable. In the same way, there is a danger in assuming that a gesture or sensor-based input is necessarily more natural. If a user needs to develop a new mental model of how a product “sees” them, or detects their presence, then the illusion has broken down. An example of this can be found in many airport and hotel bathrooms, where people wave their hands in frustration near unfamiliar sink fixtures in an attempt to discover how the sensor is triggered.
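One way to encode that judgment, sketched here with invented channels, costs, and thresholds, is a delivery policy that weighs a notification’s urgency against the social cost of each output channel in the current environment:

```python
# Hypothetical policy: a notification that is unobtrusive as a phone badge
# may be unwelcome when spoken aloud in a shared living room.
CHANNEL_INTRUSIVENESS = {
    "phone_badge": 1,    # glanced at when the user chooses
    "ambient_light": 2,  # visible in the room, but silent
    "spoken_aloud": 4,   # interrupts everyone present
}

def choose_channel(urgency: int, others_present: bool) -> str:
    """Pick the most salient channel the moment can justify.

    urgency runs from 1 (trivial) to 5 (critical); the costs and the
    budget formula are illustrative assumptions.
    """
    budget = urgency + (0 if others_present else 1)  # alone, tolerate more salience
    ranked = sorted(CHANNEL_INTRUSIVENESS.items(), key=lambda kv: kv[1])
    viable = [name for name, cost in ranked if cost <= budget]
    return viable[-1] if viable else "phone_badge"

print(choose_channel(urgency=1, others_present=True))  # -> phone_badge
print(choose_channel(urgency=5, others_present=True))  # -> spoken_aloud
```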

The technology may be new, but designers need not start from scratch as they wrestle with orchestrating good experiences that span digital and physical. As more complex behaviors move off the screen, interaction designers should augment their knowledge of digital systems with more than a century’s worth of industrial design lessons on how to engage the full range of human senses.

Cropped image on article and category pages by Judy Schmidt on Flickr, used under a Creative Commons license.
