The Past and Future of Experience Design

Written by: Nathan Shedroff

Ten years ago, when I wrote “The Making of a Discipline: The Making of a Title” (2002), there was a big debate: Is experience design about online and mobile interfaces, or is it something more? Forward-thinking initiatives, like the AIGA’s Advance for Design, began the conversation at the convergence of the media, technology, and business worlds. Started by Clement Mok and Terry Swack, and supported by Ric Grefe, this group met periodically for several years to talk about the changes in these industries and how to both manage and communicate them. (See: AIGA Experience Design – Past, Present and Future) Even then, the term “experience design” was controversial and, while it became the name of the professional group that evolved out of this effort (AIGA Experience Design), the term was dangerously close to being limited to designing digital products such as websites and mobile applications.

Designers have been reluctant to embrace the idea of experience, and I’m not sure why. Every single person involved with the Advance for Design and AIGA ED sought out and appreciated experiences in his or her life, whether in theater and entertainment, quality customer service, or any kind of real-life event. Yet many didn’t feel comfortable with the idea that we were creating total experiences in a professional context (as opposed to digital interfaces and media only). I remember near knock-down, drag-out fights, online and in person, over whether experiences like great meals, spectacular events like Cirque du Soleil, or retail experiments like Target’s pop-up shops could teach interaction and visual designers lessons in making better experiences (and whether these physical-world experiences, too, belonged under the umbrella of experience design).

To tell the truth, this desire to limit experience design to the digital world always puzzled me, especially given how rapidly experience came to dominate the branding profession (resulting in the now-ubiquitous term “brand experience”) and the retail and hospitality industries (today, we call this service design). Brand professionals woke up to the fact that branding was more than the application of a corporate or brand identity. Before interactive media reminded them that brands had been interactive all along, most of the work in brand strategy, design, and management was focused on identity standards and packaging design. Interactive media forced the conversation that reinvigorated this entire profession (along with all media and business) and recast many of these professionals as visionaries and strategists, where before they were mostly regarded as “design Nazis.”

It just seemed ludicrous that experience could only live in the narrow world of digital media when it was already so vibrant in all other media.

There was another debate at the time as well (and maybe this explains the reluctance to embrace experience design I referred to earlier), one that seemed even more ridiculous: Can you design experiences for people? Many in the community argued that there was no way we could design (read: control) experiences for such a wide variety of people. By this, I understood them to mean that we couldn’t design experiences that others would move through in the exact way we imagined, and that we couldn’t evoke personal memories to trigger the emotional and deeper reactions we intended. If that is the definition of delivering experiences, perhaps they were right. Or, to borrow from the movie Ratatouille: if a rat can, maybe we can too.

At the same time this conversation was going on, Martha Stewart was building an empire by helping people create better, more meaningful weddings and dinner parties. Weren’t her customers learning something about creating experiences for others? When we went to theaters or great restaurants, were we ready to proclaim that 1) these weren’t experiences or 2) we couldn’t teach people to make moving theater or meaningful dining? Plenty of people were saying the same thing about websites. The film and culinary worlds would have laughed at our reluctance (had we bothered to consult them and bring them into our community). Perhaps they would have branded us cowards.

I believe the term (and industry) of experience design narrowly dodged a bullet in 2001 or 2002, one that would have seen it collapse under the weight of its own self-importance. It was a big fish in a small pond. For the most part, the only people calling themselves experience designers back then were in the digital fields. Even though we worked with clients and colleagues who were engineers, branders, business strategists, marketers, and chief officers of everything, we were afraid to color outside our own little box of the Web.

Two important things happened at the turn of the century: the dot-com crash and the rise of mobile. If the Web had continued to rule, the term “experience design” would probably have faded into interface design. That is useful and important work, certainly, but how can button placement compare to shaping an experience that might inspire joy, giddiness, and empowerment? Instead, we’ve grown in power and insight into user experience professionals, to the point where a major professional organization renamed itself.

Humans have always created experiences for others: birthday parties, weddings, films, theater, art, speeches, hospitality, and more. Whether or not they were deliberately designed as experiences, they all delivered experiences. When the experience isn’t considered but works nonetheless, we chalk it up to intuition or good luck. Or we end up with a bad experience. That’s not a desirable, or professional, way to work in the world.

Considering experience as we design is not that new. Louis Cheskin, probably the first experiential marketer, was researching experiences (including emotions and core meanings) back in the 60s. Walter Dorwin Teague, probably the first experience designer, was designing experiences across media despite never being trained to do so (if you can get a copy of his book, Design This Day, you can read how and why).

From Shedroff's excellent SXSW presentation on scifi's influence on designIt’s also arrogant to believe that we can’t learn from theater or retail or any other human domain how to improve the things we design and deliver. In my own professional experience, I’ve learned lessons from my colleagues and friends in medicine, sports, music, and especially theater. I’ve learned valuable lessons about interaction design from improv, biology, and even science fiction. I’ve learned about color, lighting, and music from, yes, Cirque du Soleil. I’ve learned about designing emotionally, and developing meaningful experiences from psychology. I’ve learned about systems design and stakeholders from sustainability. In fact, in the world of sustainable systems, we learn from nature itself.

Lately, I’ve been learning from the music composition and gaming worlds how to develop and deliver better experiences over longer timelines. I don’t understand why it was once deemed illegitimate to look to these sources for ideas, inspiration, and useful lessons. But perhaps it’s moot now, as it no longer seems to be an issue and new generations of designers simply aren’t interested in this controversy.

So let’s move on. Let’s have more discussions about where we’re going. Experience design seems pretty stable, both in its scope and its practice. We’re constantly adding to the knowledge and developing new tools to express the development and delivery of experiences to everyone involved in their creation. We’ve come a long way in ten years, sure, but every day the environmental and biological sciences push forward our understanding of human behavior and the world we live in. This means we still have new discoveries ahead of us about how to design amazing experiences. Designers need to learn more about designing sustainably, humanistically, and systemically. We need to further refine our techniques for design and customer research, enlarging our understanding of people past emotions and into values and meaning. We shouldn’t be afraid to go in these directions. Designing new experiences in new ways carries a higher risk of failure, but also a higher potential reward in greater impact and behavioral change.

Lastly, we need to better understand business language, issues, and concerns. To have the influence we think we should, we need to enlarge the solutions we create so that they can operate effectively in the economic and political systems of business. Experience isn’t just something that gets imagined and designed. It gets funded, delivered, and managed. This is one of the reasons I earned my own MBA and then started a wholly new business program for those interested in leading innovation from the inside. Experience design is just one more system we need to understand to work professionally and to successfully develop and deliver better products, services, events, and environments.

The future of experience design has never held more promise. But, to fulfill this promise, we have to explore, learn, and work passionately and confidently (even courageously, at times) in new domains. The things we create usually aren’t any less ephemeral than the experiences they deliver (how many websites or campaigns or apps or events have you created in your career that are no longer available?). What lasts, at least in the minds and reactions of our customers, are the experiences around these things. Ultimately, this is also where we derive our own greatest satisfaction in our work. It will be what makes us smile when we think of a project years from now: instead of focusing on how we created it or how much we earned, we will fondly look back on the experiences it created for people.

Computer Human Values

Written by: Nathan Shedroff
“This is not a crisis of technology or computing power, but one of imagination, understanding and courage.”

Computers and related devices need to be more human

As computers and digital devices increasingly insert themselves into our lives, they do so at an ever more social level. No longer are computers merely devices for calculating figures, graphing charts, or typing correspondence. When producers launched the first personal computers into the market over 20 years ago, they could think of no better use for them than storing recipes and balancing one’s checkbook. They couldn’t predict how deeply computers (and related devices) would seep into our lives.

Computers have enabled cultures and individuals to express themselves in new and unexpected ways, and have enabled businesses to transform how, where, when, and even what business they do. However, this rosy outlook has come at a price: computers have become more frustrating to use. In fact, the more sophisticated the use, the application, the interface, and the experience, the more important it is for computers and other digital devices to integrate fluidly into our already-established lives without requiring us to respond to technological needs. And the more widespread these devices become, the more socially agile they need to be in order to be accepted.

Interfaces must:

  • Be more aware of themselves.
  • Be more aware of their surroundings and participants/audiences.
  • Offer more help and guidance when needed, in more natural and understandable ways.
  • Be more autonomous when necessary.
  • Be better able to help build knowledge as opposed to merely processing data.
  • Be more capable of displaying information in richer forms.
  • Be more integrated into a participant’s workflow or information and entertainment processes.
  • Be more integrated with other media.
  • Adapt more automatically to behavior and conditions.

People default to behaviors and expectations of computers in ways consistent with human-to-human contact and relationships.

Ten years ago, when the computer industry was trying to increase sales of personal computers in the consumer space, the barrier wasn’t technological but social. For the most part, computers just didn’t fit into most people’s lives. This wasn’t because they were lacking features or kilohertz; it was because they didn’t really do much that was important to people. It wasn’t until email became widespread and computers became important to parents in the education of their children that computers started showing up in homes in appreciable numbers. Now, to continue “market penetration,” we’ll need not just to add new capabilities, but to build new experiences for computers to provide to people, experiences that enhance their lives in natural and comfortable ways.

If you aren’t familiar with Cliff Nass and Byron Reeves’ research at Stanford, you should be. They showed (and published in their book The Media Equation) that people respond to computers as if they were other people. That is, people default to behaviors and expectations of computers in ways consistent with human-to-human contact and relationships. No one expects computers to be truly intelligent (well, except the very young and the very nerdy), but our behaviors betray a human expectation that things should treat us humanely and act with human values as soon as they show the slightest sophistication. And this isn’t true merely of computers, but of all media and almost all technology. We swear at our cars, we’re annoyed at the behavior of our microwave ovens, we’re enraged enough to protest “corporate” behavior, and so on. While on an intellectual level we know these things aren’t people, we still treat them as such and expect their behavior to be consistent with the kind of behavior that, if it doesn’t quite meet Miss Manners’ standards, at least meets the standards we set for ourselves and our friends.

We should be creating experiences and not merely “tasks” or isolated moments in front of screens.

Experiences happen through time and space and reflect a context that’s always greater than we realize. Building understanding for our audience and participants necessarily starts with context, yet most of our experiences with computers and devices (including application software, hardware, operating systems, and websites) operate as if they’re somehow independent of what’s happening around them. Most people don’t make these distinctions. How many of you know people who thought they were searching the Web or buying something at Netscape five years ago? Most consumers don’t distinguish between MSN, Windows, Internet Explorer, AOL, and email, for example. It’s all the same to them because it’s all part of the same experience they’re having. When something fails, the whole collection is at fault. It’s not clear what the specific problem might be, because developers have made it notoriously difficult to understand what has truly failed or where to start looking for a solution.
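
One small, concrete way to honor that reality is to attribute failures in the language of the person’s experience rather than the developer’s stack. Below is a minimal TypeScript sketch of the idea; the names (ExperienceError, loadProfile, the /api/profile path) are hypothetical, invented for illustration rather than taken from any particular system.

    // A sketch of failure attribution: wrap the raw technical failure with
    // a label for which part of the experience broke and a concrete next
    // step, so the whole collection isn't blamed for one broken piece.
    class ExperienceError extends Error {
      constructor(
        public readonly component: string,   // the part of the experience that failed
        public readonly userAction: string,  // what the person can actually do next
        public readonly underlying: unknown, // the raw cause, kept for developers
      ) {
        super(`${component} is unavailable. ${userAction}`);
      }
    }

    async function loadProfile(userId: string): Promise<unknown> {
      try {
        const res = await fetch(`/api/profile/${userId}`);
        if (!res.ok) throw new Error(`HTTP ${res.status}`);
        return await res.json();
      } catch (cause) {
        // Translate the low-level error into experience terms.
        throw new ExperienceError(
          "Your profile",
          "Your information is safe; check your connection and try again.",
          cause,
        );
      }
    }

The technical cause still travels with the error for whoever has to debug it; the person in front of the screen never has to reverse-engineer it.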

We need to rethink how we approach helping people solve problems when we develop solutions for them. We need to realize that even though our solutions are mostly autonomous, remote, and specific, our audiences are none of these. They exist in three spatial dimensions, at a particular time and in a particular context, with further dimensions in play corresponding to expectations, emotions, at least five senses, and real problems to solve (often trivial ones, but real nonetheless).

Most of you probably create and use user profiles and scenarios during development to help understand your user base. These are wonderful tools, but I have yet to see a scenario that includes someone needing help. I’ve never seen a scenario with a truly clueless user who just doesn’t get it. Yet we’ve all heard the stories from the customer service people, so we know these people exist. When users pull out the assembly instructions, the operating instructions, or even the help manual, these really don’t help, because they weren’t part of the scenario or within the scope of the project (the help system never gets the same consideration and becomes an afterthought). They may not be part of the “interface,” but they are part of the experience.

This is what it means to create delightful experiences, and it’s a good way of approaching the design of any product or service. What delights me is when I’m surprised by how thoughtful someone is, how nice someone is in an adverse situation, and when things unexpectedly go the way I think they should (which is usually how I would expect a person to act).

“What we need are human values integrated into our development processes that treat people as they expect to be treated and build solutions that reflect human nature.”

Interfaces must exhibit human values

Think about how your audience would relate to your solution (operating system, application, website, etc.) if it were a person.

Now, I’m not talking about bringing back Bob. In fact, Bob was the worst approach to these ideas: he embodied a person visually and then acted like the least courteous, most annoying person possible. But this doesn’t just apply to anthropomorphized interfaces with animations or video agents. All applications and interfaces exhibit the characteristics that Nass and Reeves have studied. Even before Microsoft Word had Clippy (or whatever that little pest is called), it was a problem. Word acts like one of those haughty salesclerks in a pricey boutique. It knows better than you. You specify 10-point Helvetica, but it gives you 12-point Times at every opportunity. It constantly and consistently guesses wrong on almost everything. Want to delete that line? It takes hitting the delete key three times if the line above it starts with a number, because of course it must, must be a numbered list you wanted. You were just too stupid to know how to do it. Interfaces like Word’s might be capable in some circumstances, but they make for a terrible experience because they go against the human values of courtesy, understanding, and helpfulness, not to mention grace and subtlety.

So when you’re developing a tool, an interface, or an application, or modifying the operating system itself, my advice throughout development and user testing is to ask yourself: what type of person is your interface most like? Is it helpful or boorish? Is it nice or impatient? Is it dumb, or does it make reasonable assumptions? Is it something you would want to spend a lot of time with? Because, guess what: you are spending a lot of time with it, and so will your users.

I don’t expect devices to out-think me, think for me, or protect me any more than I expect people to in my day-to-day life. But I do expect them to learn simple things about my preferences from my behavior, just like I expect people to in the real world.
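
To make that expectation concrete, here is a minimal TypeScript sketch of learning one simple preference from behavior. The AdaptiveDefault helper and its three-corrections threshold are invented assumptions; the point is that if a person keeps overriding a default the same way, the interface quietly adopts their choice instead of fighting them (the opposite of Word forcing 12-point Times on you).

    // Adopt a person's repeated correction as the new default, instead of
    // re-imposing our own guess every time.
    class AdaptiveDefault<T> {
      private overrides: T[] = [];

      constructor(private current: T, private readonly threshold = 3) {}

      get value(): T {
        return this.current;
      }

      // Record what the person actually chose this time.
      observe(choice: T): void {
        if (choice === this.current) {
          this.overrides = []; // default accepted; nothing to learn
          return;
        }
        this.overrides.push(choice);
        const recent = this.overrides.slice(-this.threshold);
        // The same correction, made `threshold` times running: stop arguing.
        if (recent.length === this.threshold && recent.every((c) => c === choice)) {
          this.current = choice;
          this.overrides = [];
        }
      }
    }

    // Usage: the font default quietly follows three identical corrections.
    const font = new AdaptiveDefault("12-point Times");
    ["10-point Helvetica", "10-point Helvetica", "10-point Helvetica"]
      .forEach((c) => font.observe(c));
    console.log(font.value); // "10-point Helvetica"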

Human experiences as a model

When developers approach complex problems, they usually try to simplify them; in other words, “dumb them down.” This usually fails because they can’t really take the complexity out of life. In fact, complexity is one of the good things about life. Instead, we should be looking for ways to model the problem in human terms, and the easiest way to do this is to look at how humans behave with each other (the good behaviors, please). Conversations, for example, can be an effective model for browsing a database (see the sketch below). This doesn’t work in every case, but it is a very natural (and comfortable) way of constructing a complex search query without overloading a user. And just because the values are expressed in words doesn’t mean they can’t correspond to query terms or numerical values. An advanced search page is perfectly rational and might accurately reflect how the data is modeled in a database, but it isn’t natural for people to use, making it uncomfortable for the majority, however many technologically aware people might be able to use it. There is nothing wrong with these check-box-laden screens, but there is nothing right about them either. We’ve just come to accept them.
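
Here is a minimal TypeScript sketch of that conversational model. The wine-shopping framing, the prompts, and the field names are all invented assumptions; the point is that each sentence-like answer quietly maps onto a structured query clause, so a person composes a complex search by answering questions rather than filling in a grid of checkboxes.

    // Each conversational answer corresponds to a structured query clause.
    type Clause = { field: string; op: "<" | ">" | "="; value: string | number };

    const conversation = [
      {
        prompt: "What are you in the mood for?",
        answers: new Map<string, Clause>([
          ["Something light", { field: "body", op: "=", value: "light" }],
          ["Something bold", { field: "body", op: "=", value: "full" }],
        ]),
      },
      {
        prompt: "How much would you like to spend?",
        answers: new Map<string, Clause>([
          ["Keep it affordable", { field: "price", op: "<", value: 15 }],
          ["It's a special occasion", { field: "price", op: ">", value: 40 }],
        ]),
      },
    ];

    // Turn the person's answers into a query the database understands.
    function buildQuery(choices: string[]): string {
      return conversation
        .flatMap((step, i) => {
          const clause = step.answers.get(choices[i]);
          return clause ? [clause] : [];
        })
        .map((c) =>
          typeof c.value === "number"
            ? `${c.field} ${c.op} ${c.value}`
            : `${c.field} ${c.op} '${c.value}'`
        )
        .join(" AND ");
    }

    console.log(buildQuery(["Something light", "Keep it affordable"]));
    // body = 'light' AND price < 15

Nothing prevents the same clauses from driving the familiar advanced-search machinery underneath; only the surface people meet changes.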

God is in the details

As Mies van der Rohe said, “God is in the details.” Well, these are the details and the fact that they’re too often handled poorly means that technological devices are ruled by a God that is either sloppy, careless, absent-minded, inhuman, or all of the above.

This isn’t terribly difficult but it does take time and attention. And we don’t need artificial intelligence, heads-up displays, neural nets, or virtual reality to accomplish it. There is a reason why my mother isn’t a fighter pilot—several, in fact. But the automobile industry in the U.S. spends tens of millions of dollars each year trying to develop a heads-up display for cars. That’s all my mother needs—one more thing to distract her from the road, break down someday, and scare her even more about technology and making a mistake. What we need are human values integrated into our development processes that treat people as they expect to be treated and build solutions that reflect human nature.

Everything is riding on this: expansion into new markets, upselling newer and more sophisticated equipment, solving complex organizational problems, reducing costs for customer service, reducing maintenance costs, reducing frustration, and (most of all) satisfying people and helping them lead more meaningful lives. Companies can no longer differentiate themselves on quality or tangibles. Instead, they try to differentiate themselves on “brand.” What marketers and engineers often don’t “get” is that the only way to differentiate themselves on brand is by creating compelling experiences with their products and services (and not just the marketing around them). Niketown and the Apple Stores would never have succeeded (at least not for long) had they not been selling good product experiences. This isn’t the only reason the Microsoft store failed (a tourist destination for buying Microsoft-branded shirts and stationery really wasn’t meeting anyone’s needs), but it was part of it. Gateway, in comparison, has been much more successful, though they still aren’t getting it quite right.

The Apple Store is a good example. You can actually buy things and walk out with them (unlike the Gateway stores, which really disappoint customers by breaking this social assumption). What’s more, anyone can walk in, buy a DVD-R (they come only in 5-packs, though), and burn a DVD on the store equipment. Really, I’ve done it. I may be the only person who has ever taken Steve Jobs up on this offer, but it is a very important interaction, because most people aren’t going to have DVD-R drives for a while, and neither are their friends. Most people don’t even have CD-R drives, but if they want to burn a DVD of their children’s birthday party to send to the grandparents, what else are they going to do? This recognition of their users’ reality is what made Apple’s approach legendary (not that it hasn’t been tarnished often). It’s not a technological fix; it’s not even an economic one. In this case, access is the important issue, and allowing people to walk in off the street, connect their hard drive or portable, and create something with their precious memories became the solution. It works because it supports our human values (in this case, sharing). It works because this is what you would expect of a friend or someone you considered helpful. This is not only a terrific brand-enhancing experience; it jibes with our expectations about how things should be, and that is what social and human values are all about.

This is not a crisis of technology or computing power, but one of imagination, understanding, and courage. I would love to see designers create solutions that feel more human in the values they exhibit. This is what really changes people’s behaviors and opinions. Just wanting things to be “easy to use” isn’t enough anymore, if it ever was. If you want to differentiate your solution, if you want to create and manage a superior customer relationship, then find ways to codify all those little insights experts have, in any field, about what their customers need, desire, and know into behaviors that make your interfaces feel like they’re respecting and valuing those customers. This is the future of user experiences, user interfaces, and customer relations, and it’s actually a damn fine future.

For more information

  • Microsoft Bob was a “personal information manager” Microsoft built around Nass and Reeves’ research. Bob was a personified (read “anthropomorphized”) character that represented the application. He came with a cadre of associates whom users could choose instead of Bob, based on whichever personality characteristics felt more comfortable. Fair enough. Bob’s downfall, however, was that no matter which character the user chose, they were all too prominent and annoying. Their designers failed to understand that the occasions when these characters were needed and desired were drastically fewer than their programming assumed, and that their personalities raised our expectations far beyond what they were capable of delivering.
  • The Media Equation by Cliff Nass and Byron Reeves. CSLI Publications, June 1999.
Nathan Shedroff has been an experience designer for over twelve years. He has written extensively on the subject and maintains a website with resources on Experience Design at www.nathan.com/ed.

The Making of a Discipline: The Making of a Title

Written by: Nathan Shedroff

This year I published a book, titled ‘Experience Design’, based not so much on an emerging field as on an emerging mindset: a growing awareness that the most powerful experiences cross traditional professional boundaries, and that we as designers of experiences must pursue our work with the big picture in mind. Indeed, effective Experience Design encompasses myriad fields, from online to desktop, from print to exhibits, from interaction design to copywriting, from brand management to theme park ride design.

Information design was clearly a brave, new field—and the new titles—“instructional designer” or “interface designer”—sounded perfect for the future of the Information Age.

The shining examples of pan-media experience design—Disney, Nike, Coca-Cola and Star Trek to name a few—might make this seem straightforward. However, many people who work within the design field have had a hard time assimilating the full scope of Experience Design—and a harder time accepting their niches within it. The reasons for this resistance uncover much about the state of design as well as the state of identity—that’s personal identity, not corporate identity.

A title is born

A little history might help here. Around 1989 or 1990, back in the days before an interactive media industry—yes, before QuickTime even—there was a very small community of information designers. Most of these people came from the print world. They worked on a variety of projects, including complex signage, directories, catalogs and information systems. Many of these designers bore the titles “instructional designer” or “interface designer.”

The larger design community had trouble understanding and accepting this field, as it was decidedly more obscure and conceptual than traditional graphic design. However, information design was clearly a brave, new field—and the titles sounded perfect for the future of the Information Age. The more savvy traditional designers learned new techniques and applied them to these new concerns, but many others simply adopted the titles without learning much of anything.

Unfortunately, this was not the last time designers would update their business cards without a commensurate upgrade in skills.

The information design community owes its founding largely to Richard Saul Wurman. He was the first to identify the issues of clarity, meaning and understandability in the print world, as well as some of the techniques designers could use to organize data and create information (as in informing). He communicated these principles both inside and outside the design community, and he firmly established information design as a measurable benefit to both communication and business. Through his company, TheUnderstandingBusiness (which was established in 1987 and where I was fortunate to work for a few years) he and his designers defined many of the techniques and processes that would become information design.

The inclusiveness of the term Information Architect was illustrated by the diverse collection of media, styles and techniques in Wurman’s book by the same name.

To be sure, there were others practicing what can be considered information design. Siegel & Gale, a design firm based in New York City, was redesigning and rewriting documents and forms—even tax forms—to make them easier to use (they called this approach “plain English”). Edward Tufte had written the successful book, “The Visual Display of Quantitative Information,” and Massimo Vignelli had declared himself an information designer as well. Things were looking good—or maybe “clear” is a more appropriate word here.

However, fairly quickly, many visual designers who merely wanted to decorate data (think chartjunk) also declared themselves information designers. As I remember, the information design world was fairly accepting. If there had been an information design table, places were available for everyone who wanted a seat. Information designers didn’t exactly equate visual styling with hard-core information design, so they might have seated visual designers at the end of the table. But at least everyone was included at the table.

About this time, Wurman started using the term Information Architect, a rearrangement of the phrase Architecture of Information, which he coined at the 1972 Aspen Design Conference. In terms of skills, practice, process and expectations, the term Information Architect described the existing fields of information design and visual design. It was simply a new label invented for the purpose of elevating the profession as a whole in the eyes of a population that wasn’t particularly design savvy. The inclusiveness of the term Information Architect was illustrated by the diverse collection of media, styles and techniques in Wurman’s book by the same name.

About this time the internet started commanding the majority of work in the interactive industry. Luckily, my company, vivid studios, as well as a few others (such as Clement Mok Designs), had already translated our information design skills from print to interactive media. Information design was already part of our development processes. Of course, we had to teach every client what information design was, what it accomplished, and why it had to be in the budget. On our sites, we published not only our job descriptions and processes, but also our theories. This is how information design crossed into the interactive world, where it was wholeheartedly accepted and has been firmly rooted ever since.

Gold rush

It didn’t take long for people with innate skills and applicable experience to find their way into the interactive field, but it was still one of the rarest of professions since no one could find classes, let alone degrees, in information design. Eventually the flood of dotcom startups required so many information designers that anyone who could draw a flowchart was soon hired and given the title (to the eventual dismay of many clients).

It’s a sad state of affairs when each company—and potentially each freelancer and consultant—reinvents a new vocabulary simply to call their own, while further confusing clients and the world-at-large just at the moment we should be clearly communicating who we are and what we do.

I guess it’s inevitable with a fast-growing field that the very people who were pouring in from other places began rapidly marking not only their own turf, but everyone else’s as well. About two years ago, the slight schism between visual decoration and information design opened into a gulf between the information architects, who claimed the best, most strategic, and most cognitive aspects of information design, and the information designers, who were relegated by these titans to following tactical instructions, performing menial tasks, and, generally, making the smallest contributions to the structuring of information and experiences. Make no mistake: this was a political and strategic attempt to elevate a stratum of people who would, hopefully, become the elite of the information designers. The architects were to designers as traditional architects were to interior designers.

I have witnessed many times the attempts of information architects to trump information designers simply by title alone—as if anyone actually understands the difference between the two. In fact, almost all processes, techniques and tasks are shared. (The only useful differentiation between the roles occurs at the personal level, where each person’s skills must be weighed against a project’s requirements. This, of course, is exactly the point where differentiation makes sense.)

There seems to be an opinion that information architecture applies exclusively to online media, that offline media can’t possibly pose problems as complex or as important. For sure, many large online projects can get complex, but I have yet to encounter an online project as complex or important as some of those I saw at TheUnderstandingBusiness. I also see information architects rushing to define the field in steps and techniques that are tactical at best. Most of the designers I worked with—and was taught by—at TheUnderstandingBusiness still approach problems from a higher conceptual level (and generate much more sophisticated and original solutions) than most of the architects in a hurry to separate themselves at the top of the profession. And most in this former group still go by the title, information designer.

That brings us back to Experience Design—or is it Architecture? For a field that is barely two years old, the exact same egomaniacal process is starting, but this time with even less substance. I sat through a presentation last year on Experience Architecture which, as far as I could tell, had no new insights, processes, or techniques to offer beyond what would already be covered (or uncovered) in Experience Design. The only reason for the title was to differentiate this one company’s offering. It’s a sad state of affairs when each company—and potentially each freelancer and consultant—reinvents a new vocabulary simply to call its own, while further confusing clients and the world-at-large just at the moment we should be clearly communicating who we are and what we do.

Can you imagine a group of Fashion Architects declaring their supremacy over Fashion Designers? Yes, that’s what we’ve come to. We don’t get enough respect as it is from clients and engineers, and we’ve almost completely lost the ear of corporate leaders. Imagine if they found out how shallow and vain the profession is becoming.

While IA and ID battle each other for dominance, visual (or graphic) design seems to have already lost. Case in point: at the fourth annual AIGA Advance for Design workshop last year, the following roles were identified for discussion:

  • Design Planner
  • Brand Strategist
  • User Researcher
  • Visual Systems Designer
  • Information Architect/Information Designer
  • Interaction Designer
  • Usability Specialist

You will not find “Visual Designer” or “Graphic Designer” in that list. The closest thing was Visual Systems Designer, which the organizers insisted was far more elaborate than mere graphic design. To make matters worse, the role of Visual Systems Designer was quickly perverted into Visual Information Designer, which became nearly synonymous with Information Architect, a separately identified role.

This circuitous examination may be pointless, but at least it isn’t frightening. What’s scary is the fact that there were no defined places for visual/graphic design, animation, interface design, typography, videography, sound design or any of the other important fields that synthesize all of the decisions and breathe life into the interface. At least one visual designer there started feeling there wasn’t a place for her at all in the community. Perhaps, in our need to define new horizons, we’re forgetting our roots.

What’s in a name

As a field trying to define ourselves, we’ve already elevated our status so far that we don’t have time for tactics or work. Only the most strategic of activities and the most important thoughts warrant our attention.

I hate the word “creative” as anything but an adjective modifying a noun worth modifying.

OK, it may not be this bad yet, but it’s certainly the direction we’re heading. Imagine discounting the joy of visual expression—the satisfaction that comes from balancing the cognitive, engineering, and emotional goals of a project so well that their recognition falls away and all that is left is a powerful visual solution. Imagine telling audio engineers and videographers (also key partners in the creation of many experiences) they aren’t a part of the process unless they can describe themselves as audio strategists and video systems designers. Now imagine trying to finish a project yourself after these professionals have left in disgust.

We started calling our “creative” group at vivid the Experience Group in 1994, partly for these reasons. We adopted the new name because it had the right mix of ambiguity and newness that stunned people long enough to hear our definition, and it avoided many of the problems with other names—especially “Creative Group.”

I hate the word “creative” as anything but an adjective modifying a noun worth modifying. When it is used as a noun, as in “we need to get some creative” or “we should hire some creatives,” the word marginalizes and devalues the contributions that front-end and “artsy” people make. When people actually refer to themselves as “creatives,” I pity them. I learned a long time ago that everyone in a company had better be creative, and that the most creative person at vivid was the CFO.

All of this reminds me of my experiences at the CHI (Computer-Human Interaction) conferences. CHI is a special interest group within the ACM (Association for Computing Machinery). It was nearly impossible to get a design-oriented paper, panel, or speech accepted as part of the CHI program.

For the most part, the only people deemed fit for the program were a) well-known members who happened to be designers or researchers and b) interface specialists who were now turning their attention to “design.” Courses, papers, and panels reviewed by the CHI leadership routinely came back with comments like “Is this important?,” “Isn’t there a better conference for these issues?,” “This doesn’t seem to be in the scope of CHI,” and “There isn’t much of scientific value here.”

I stopped going to CHI conferences in 1990. It was apparent that the ruling class not only couldn’t recognize new fields and techniques in design, but wouldn’t.

Experience Design is threatened by the same sort of shortsightedness and exclusivity. Are we going to succumb to infighting, name-calling, and endless arguments over definitional minutia, or are we going to expand our sights—and our boundaries—to include all of the elements we need to create dazzling—and valuable—experiences?

The most eloquent description of Experience Design I’ve read comes not from the design world but from a New York City restaurant reviewer named Gael Greene. In an interview with Matthew Goodman in the June 2001 issue of Brill’s Content, she said:

“I thought a restaurant review should describe what your experience was like from the moment you called to make a reservation. Were they rude? Did they laugh at you for trying to get a table? …”

That’s what it’s all about: the complete experience, beginning to end, from the screen to the store, to the ride and beyond.

Lee McCormack assisted with this piece. He is a writer, editor and information architect/designer/whatever. He currently plies his trade at AltaVista.