The desert was frigid and the sun was just peeking over the mountains when we arrived at the Mojave Air & Space Port. A full-scale rocket prototype at the end of a long driveway marked the spaceport’s entrance. I was just a walk away from realizing a small part of a big dream: being part of space exploration.
Space travel and participation in the space economy are pretty much impossible for all but the most elite scientists, astronauts, and billionaires. XCOR, an aerospace company with unique methane-based rockets and a reusable space plane, enlisted my firm to design a completely immersive experience and bring the exhilaration of space travel alive for students, pilots, and investors.
The project required the transportive power of virtual reality and an entirely new approach to storytelling rooted in empathy. Our solution: a virtual mock-up of the rocket’s cabin, in a spherical projection of low Earth orbit, that could be experienced through a virtual reality headset. Guided by realistic physics, pilots could navigate the ship in any direction and, for a few moments, experience spaceflight.
Building a new reality
VR is a lens to places you can’t normally go, whether that be the front line of a war, the bottom of the ocean, or 70 miles above Earth’s surface. In the case of the latter, the headset becomes much more than a technology. It’s a door to the final frontier.
More than any other medium, VR can inspire empathy for the individuals whose experiences are recreated. Project Syria, for instance, transports users directly onto an Aleppo street besieged by bombs. Experiencing another’s reality firsthand is a powerful tool for change — one we’re just beginning to explore.
But designing VR in a way that realizes the medium’s full potential takes more than just technical skill. It takes empathy to fully recreate another person’s reality and bring the experience alive for users. Immersive VR design also requires an acute understanding of human behavior and an understanding of users’ desires and needs. To ensure you’re designing with empathy, follow these simple strategies:
1. Have a point of view.
VR as a tool for creating empathy requires a point of view. You’re taking your audience on a journey, and you as the designer are guiding the experience. In any way possible, immerse yourself in the subject matter you’re trying to recreate.
Observation and user diaries are two effective methods for capturing the firsthand point of view. The virtual experience we designed for XCOR, for instance, was translated from firsthand accounts from XCOR’s team. I worked with astronaut Brian Binnie to understand what spaceflight would look, feel, and sound like inside the company’s proposed rocket-powered spaceplane. I saw inside the cockpit and spoke directly to almost every person involved, from the operations team to the engineers. Forming a point of view requires an empathetic approach to the work at hand.
2. Understand the medium.
Most designers are accustomed to 2D interfaces, but VR isn’t linear. It’s spherical in every sense. Audio can be experienced from any angle, and environments can be explored in any direction.
Your audience can choose to walk through a door, turn a corner, or sit down. What happens when they do? Learn everything you can about the methods and tools for creating new kinds of experiences, and don’t assume what you’ve learned in the past will apply to 3D environments.
3. Understand the technology.
VR isn’t everywhere yet, so our exposure to it as designers remains limited. Apps such as The New York Times VR app, Google’s Cardboard, and Roundme are powerful platforms for experiencing the technology. Headsets are powerful but also bulky and far less common, so explore the capabilities and limitations of mobile before assuming headsets and powerful computing platforms are necessary. For most design projects, broad exposure will trump performance because brand impact is strongest when access to an experience is widespread.
4. Prototype everything.
No matter the technology you’re designing for, prototype everything so you can experience how it will perform and explore unintended consequences in a cost-effective manner. In my firm, we’ve created cardboard iPhones and paper strips that represent interactive screens. We project interfaces to simulate gestural touch screens. VR environments can be simulated easily to explore nonlinear content creation. You’re living in the reference shot, so use the world around you to set up your environments and plan your experiences.
5. Consider the entire user experience.
VR is an isolating technology. Be aware of the barrier you’re putting between your audience and the physical world, and design stories and experiences that the audience can jump in and out of quickly.
Because VR can be disorienting and physically demanding, take posture, position, and motion into account. Turning around and looking behind you isn’t as easy as it seems when you’re sitting in a chair with limited sensory input, with your eyes and ears occupied by headphones and close-proximity screens.
Designing a fully immersive experience is difficult, but the payoff is immense — if you design with empathy. Strive to fully understand the way your users think and work. Understand how they might be different and how they’ll feel moving about in the world you’ve created.
That’s right! I said it. For us (designers, information architects, interaction designers, usability professionals, HCI researchers, visual designers, architects, content strategists, writers, industrial designers, interactive designers, etc.) the term user experience design (UX) is useless. It is such an overgeneralized term that you can never tell if someone is using it to mean something specific, as in UX = IxD/IA/UI#, or to mean something overarching that covers all design efforts. In current usage, unfortunately, it’s used both ways. Which means that when we think we’re communicating, we aren’t.
Of course there is UX for us
If I was going to define my expertise, I couldn’t give a short answer. Even when UX is narrowly defined, it includes interaction design (my area of deep expertise), information architecture (a past life occupation), and some interface design. To do it well, one needs to know about research, programming, business, and traditional design such as graphic design as well. Once, to do web design you had to be a T-shaped person: someone who knows a little bit about many things and a lot about one thing. Imagine a programmer who also understands a bit about business models and some interface design. But as our product complexity grows, we need P- and M-shaped people: people with multiple deep specialties. To design great user experiences, you need to specialize in a combination of brand management, interaction design, human-computer interaction, and business model design. Or you could be part of a team. The term UX was welcomed because we finally had an umbrella of related practices.
Of course, we don’t all belong to the same version of that umbrella. We all bring different focuses under the umbrella, different experiences, mindsets, and practices. While we can all learn from each other, we can’t always be each other.
But trouble started when our clients didn’t realize it was an umbrella and thought it was a person. And they tried to hire that person.
It isn’t about us
If there is any group for whom UX exists now more than ever it is non-UXers. Until 2007, the concept of UX had been hard to explain. We didn’t have a poster child we could point to and say, “Here! That’s what I mean when I say UX.” But in June 2007, Steve Jobs gave us that poster child in the form of the first generation iPhone. And the conversation was forever changed. No matter whether you loved, hated, or couldn’t care less about Apple, if you were a designer interested in designing solutions that meet the needs of human beings, you couldn’t help but be delighted when the client held up his iPhone and said, “Make my X like an iPhone.”
It was an example of “getting user experience right.” We as designers were then able to demonstrate to our clients why the iPhone was great and, if we were good, apply those principles in a way that let our clients understand what it took to make such a product and its services happen. You had to admit that the iPhone was one of the first complete packages of UX we have ever had. And it was everywhere.
Now five years later, our customers aren’t saying they want an iPhone any more. They are saying that they want a great “experience” or “user experience.” They don’t know how to describe it, or who they need to achieve it. They have no clue what it takes to get a great one, but they want it. And they’ll know it when they see it, feel it, touch it, smell it.
And they think there must be a person called a “user experience designer” who does what other designers “who we’ve tried before and who failed” can’t do. The title “user experience designer” is the target they are sniffing for when they hire. They follow the trail of user experience sprinkled in our past titles and previous degrees. They sniff us out, and “user experience” is the primary scent that flares their metaphorical nostrils.
It is only when they enter our world that the scent goes from beautiful to rank. They see and smell our dirty laundry: the DTDT (Defining The Damn Thing) debates, the lack of continuity of positions across job contexts, the various job titles, the non-existent and simultaneously pervasive education credentials, etc. There is actually no credential out there that says “UX.” None! Nada! Anywhere. There are courses for IxD, IA, LIS, HCI, etc. But in my research of design programs in the US and abroad, no one stands behind the term UX. It is amorphous, phase-changing, and too intangible to put a credential around. There are too many different job descriptions all with the same title but each with different requirements (visual design, coding, research being added or removed at will). Arguably it is also a phrase that an academic can’t get behind. There aren’t any academic associations for User Experience, so it’s not possible to be published under that title.
Without a shared definition and without credentialed benchmarks, user experience is snake oil. What’s made things even worse is the creation of credentialed/accredited programs in “service design,” which take all the same micro-disciplines of user experience and add to them the academically well-established “service management,” which gives the field academic legitimacy. This well-defined term is the final nail in the coffin, and shows UX to be an embattled, tarnished, shifty, and confusing term that serves no master in its attempt to serve all.
“User experience design” has to go
Given this experience that our collaborators, managers, clients, and other stakeholders have had with UX, how can we not empathize with their confusion about us and the phrase we use to describe our work?
And for this reason UX has to go. It just can’t handle the complexity of the reality we are both designing for and of who is doing the designing. Perhaps the term “good user experience” can remain to describe our outcomes, but user experience designer can’t exist to describe the people who do the job of achieving it.
Abby Covert said recently that the term UX is muddy and confusing. Well, I don’t think the term “user experience” is confusing so much as it’s a term used to describe something that is very broad, but is used as if it were very narrow. There is a classic design mistake of oversimplifying something complex instead of expressing the complexity clearly. UX was our linguistic oversimplification mistake. We tried to make what we do easy to understand. We made it seem too simple. And now our clients don’t want to put up with the complexity required to achieve it.
Now that the term has been ruined (for a few generations anyway), we need to hone our vocabulary. It means we can’t be afraid of acknowledging the tremendous complexity in what we do, how we do it, and how we organize ourselves. It means that we focus on skill sets instead of focusing on people. It means understanding our complex interrelationships with all the disciplines formerly in the term UX. And we must understand that they are equally entwined with traditional design, engineering and business disciplines, communities, and practices as they are to each other.
So I would offer that instead of holding up that iPhone and declaring it great UX, you can still use it as an example of great design, but take the simple but longer path of patiently deconstructing why it is great.
When I used to give tours of the Industrial Design department at the Savannah College of Art and Design (SCAD), I would take out my iPhone and use it to explain why it was important that we taught industrial design, interaction design, and service design (among other things). I’d point to it and explain how the lines, materials, and colors all combined to create a form designed to fit in my hand, look beautiful on my restaurant table, and be recognizable anywhere. Then I would show the various ways to “turn it on” and how the placement of the buttons and the gesture of the swipe to unlock were just the beginning of how it was designed to map the customer’s perception and cognition, social behaviors, and personal narrative against how the device signalled its state, what it was processing, and what was possible with the device. And I explained that this was interaction design. Finally, I’d explain how all of this presentation and interaction were wonderful, but the phone also needed services attached to it: one that allows you to make calls, another where you can buy music and applications, and the whole web of relationships between content creators, license owners, and customers that makes those services possible. And I explained that this was service design.
At no time do I use the term “user experience.” By the time I’m done, I have taught a class on user experience design and never uttered the term. My audience comes away with a genuine respect for all three disciplines explored in this example and sees them as collaborative, unique practices that have to work intimately together. There is no hope left in them for a false unicorn who can singularly make it all happen.
Like a lot of folks, I find the term “user experience design” awkward and unsatisfying, at once vague and grandiose, and not accurately descriptive of what I do. Too often it seems like a term untethered, in search of something — anything — we might use it to name. And yet I often call myself a UX designer, and have done for the last few years, because at the moment it seems to communicate what I do more effectively to more people than any other term I can find.
Obviously I don’t stand alone in finding the term useful, or at least useful enough. Yet we find ourselves endlessly discussing this and other terms for what we do … trying to describe what we do … disagreeing vigorously … and at the same time complaining about getting mired in an argument about semantics. Can’t we just get on with the work?
I don’t think we can. We cannot get past this argument about language just yet because I don’t think we really have an argument about language. We have an argument about what we do, a genuine and profound disagreement.
Looking at where the term “user experience design” comes from, and how we actually use it, I have a proposal for what we can take it to mean: design which includes interaction design but is not only interaction design.
People who think of interaction design as just one among many UX specialties may consider that a surprising overextension of that specialty’s relevance; I hope to show why it makes sense.
Trouble with the definition, not the word
I don’t much care which words we ultimately choose. Yes, it would help to use language which no one could mistake or confuse, but we cannot seem to find that and don’t strictly need it anyway. Consider the ugliness and inappropriateness of the term “industrial design.” We understand it not because it suits what industrial designers do, but because we already understand what industrial designers do and can attach the name to that generally understood meaning.
In “user experience design,” we don’t have that. We lack a clear meaning to which we can attach the term. Until we find one, the grumbling over names will continue.
Some people like the grand implications of the term “user experience design.” They include anything where one plans what experience people will have, including not just websites but interior decoration and customer service scripts and theme park rides and kitchen knives.
I feel uncomfortable with the language of “user experience design” because I don’t think we need a name to describe all of those things. At that point, why not just “design”?
Looking back at how we came to talk about UXD in the first place, that large world of design problems didn’t give rise to talk of “user experience design.” The web did.
The web gave us UXD
The term “user experience design” came as a response to the shock wave created by the emergence of the web. For most people in the field, “user experience design” means, in practice, “design for the web … and other stuff like it.” So what is the web like?
Some people with a background in graphic design tend to think of web design as visual design plus a bunch of other Design Stuff. For a long time, a lot of web designers made a binary distinction between visual design and information architecture, effectively defining IA as “all the Design Stuff for the web which isn’t visual design.” These days, most define IA more crisply than that, distinguishing between information architecture as the organization of content and “interaction design” as … well … that gets a little tricky.
For some web designers, I suspect “interaction design” represents the frontier of web design as IA once did; having accounted for visual design and information architecture, “interaction design” means, in practice, the design on the web which ain’t either of those. Others have a more specific conception of what constitutes “interaction design.”
Over in the software development universe, people have long discussed “usability engineering” and “human factors” and “user interface design” and a host of other names for the same essential work. All of those terms have their problems: philosophical, rhetorical, political. You can locate me in the era and tradition I spring from by knowing that, in circles where I can expect people will understand me, I still prefer to call myself an interaction designer rather than a UX designer because I consider it a more usefully precise term.
When one encounters a computer, or a device, or any other system which has software in it, one enters into a dialogue with that system, a cycle of action and reaction. This includes both cycles of action between individuals and the system itself, and also cycles between different people as mediated by the system. Inter-action: action between people and systems, action between people and people. Systems containing software involve categorically more complex interactions than anything else we make, which gives those systems a unique character that calls for a distinct design discipline. Hence “interaction design.”
Back in the late ’90s the term “interaction design” got tangled up rhetorically because traditional advertising and design agencies used the term “interactive media” to describe the brochure-ware they made for the web.
More recently, many people have taken “interaction design” to mean only the pick-and-shovel work of wireframing and specifying workflows, not the fundamental product or service definition which lies behind the specific interaction behaviors.
Once upon a time I wanted “interaction design” to become the term which included all of this work defining new interactive systems. Things didn’t go that way.
Interaction design. Information architecture. Visual design. Information design. Social interaction design. Service design. We have people who find these disciplinary distinctions very useful, believing that they represent well-defined types of work with reasonably well-developed methods. We have people who see talking too much about these distinctions as territorialism and semantic games that get in the way of just doing the work. Some among those have a deep skepticism that these distinctions mean much at all: compared to the classical disciplines of graphic design, industrial design, et cetera, we do not — and perhaps can not — have well-established methodologies for the new problems which designers face today. They talk in terms of a kind of open-ended design sensibility and developing an eclectic toolkit of specific techniques.
We should not minimize the differences between these philosophies. When we do, the disagreement displaces itself into discussions of language. Rather than ask what “user experience design” really means — a question with no answer — we should ask instead what problem we use it to talk about.
“User experience design” creates an uneasy truce
The term “experience design,” originally proposed by people who rejected disciplinary distinctions, has acted to paper over the disagreement.
These early advocates saw “experience design” as a way to name a new era in which the old disciplinary distinctions between design problems had broken down and become less relevant. They talked excitedly about UX design in its grandiose sense.
Then Jesse James Garrett drew his famous diagram of “The Elements of User Experience,” name-checking several different classes of design problems and suggesting a way of looking at their relationships, writing “user experience” in large letters on the diagram as a name for the whole. People who valued disciplinary distinctions could look at the diagram and see them represented there. People who wanted to transcend disciplines could look at the diagram and see the implication that each lived as part of a greater whole, incomplete on its own. So that diagram exemplified conversations which brokered an implicit truce under the banner of “user experience design.”
But we still need to understand and talk about This Thing That We Do, and we still do not agree about it. If UXD means “Designing Stuff like the web” we have to ask what we mean by “like the web.”
Interactive systems, not just the web
The 800-pound gorilla that is the web confuses our thinking. Web-ness per se did not produce the need which gave birth to the term “user experience design.” It didn’t come from people making simple websites with static pages, it came from people making web applications. And now we see it adopted by people making desktop software and mobile apps and more. What do those have in common? The network? Static websites involved the network … and we also see people talking about UX design for stand-alone desktop computer applications. So no, the network does not unify these UXD domains.
Software ties these things together. The Thing The Web Is Like is software, and in fact that statement says it backward. Better to say many things derive their nature from software, for example the web. What makes software special? What makes it different from the artifacts created with industrial design? From the images created with graphic design? From websites of static pages? Interactivity: software enters into that cycle of action and reaction in a way no static artifact can.
More than just interaction design
One might call this focus on interactivity chauvinism on my part, since I come from interaction design.
Let me underline that I do not claim that interaction design constitutes the most important component of all UXD. Let us recognize service design and information architecture and visual design and social interaction design and all the other specific design disciplines we employ in solving UX design problems. Indeed, let us notice that in many cases other design disciplines outweigh the importance of interaction design in solving a UXD problem.
One may have a big retailer’s website and mainly need information architecture to organize the vast set of pages and visual design to make the pages appealing and aligned with the brand, with just a little bit of interaction design for the search and purchasing tools. One may have a member service process for an HMO which involves sophisticated service design and classical graphic design for communicating to members and just a little bit of interaction design for things like appointment setting tools.
I don’t want to make interaction design dominant over UX design but I do want to name it as essential to UX design. The presence of interaction design usefully defines “user experience design.” The term “user experience design” did not emerge from an encounter with the need for service design, information architecture, visual design, social interaction design, or any of the other problems we talk about in the UX design world. It emerged from the encounter with complex software behaviors and the interaction design challenges they present.
It makes no sense to ask what “user experience design” really means; it means whatever we use it to mean. We can ask what we need it to mean and how we already use it. I submit that we need a term for “designing systems that include interaction design”. And we already use “user experience design” to mean that now.
If we could agree on that, I might stop feeling so bad about calling myself a “user experience designer”.
Would you rather take a photo using your phone, a point-and-shoot camera, or a digital SLR? How you answer this question is probably a good indicator of your photographic expertise. If you snap casual shots, your phone or a point-and-shoot camera will probably suffice. If you’re a professional photographer, on the other hand, you probably prefer using an SLR that gives you control over the focus, aperture, and exposure.
Expertise significantly impacts how we seek information online. Just as novice and expert photographers prefer different tools, so novices and experts behave differently when searching for information. Understanding these differences will help us design better search interfaces for both groups of users.
There are experts, and then there are experts
User expertise exists on two levels. If you’re an avid photographer, your domain expertise in photography will be quite high: that is, you’ll be familiar with the terms and techniques of the trade. Each of us is likely a domain expert in a few areas, and a complete novice in others. A second aspect is technical expertise. Familiarity with how computers, the internet, and search engines work significantly impacts how users seek information. Consider these personifications of each quadrant of expertise:
* *Angela Baer*, since completing her MFA at Pratt 5 years ago, is quickly building a reputation as one of New York’s up-and-coming fashion photographers. In the office connected to her studio, Angela edits her photographs on two large monitors and a top-end computer. She delivers the edited shoots electronically to her clients, and regularly updates her online portfolio and blog. Angela is highly proficient using her computer, and when it comes to photography, she’s a domain expert.
* Though he officially retired over 10 years ago after a successful career in banking, *William Hayes* still sits on the board of a number of financial institutions. From his Elizabethan cottage on the Kent coast, he uses a 5-year-old computer to exchange emails and access financial reports, though he prefers doing business on the phone and keeping up with the world through The Financial Times. While William is a domain expert when it comes to finance, his technical expertise is lacking.
* 18-year-old *Fane Tomescu* helps run an internet cafe in Braşov, Romania. Having saved for over a year, Fane recently came across a car that he’s considering purchasing. But when the time came to arrange car insurance, Fane had no clue how things worked. He asked his parents and friends for advice, and then spent several hours comparing providers online. Fane is a technical expert, but when it comes to insurance, he’s a domain novice.
* *Claire Jones* is a 9-year-old from Colorado Springs. Her school is holding a science fair, and Claire has decided to build a model of the solar system using styrofoam balls suspended with string. Having left her science textbook in her locker over the weekend she was meant to start building the model, Claire used the internet to look up information on the order, size, and appearance of each planet. Though she did eventually find what she was looking for (with her parents’ help), Claire would be considered both a technical and a domain novice.
While either dimension of expertise is valuable, users are most likely to succeed when both are present. There are, however, a number of design guidelines which can help both novices and experts succeed in their pursuit of knowledge.
Image 2: An orienteer at the 2010 World Orienteering Championships in Trondheim, Norway. Photo by Torben Utzon.
Wayfinding is a challenge as old as humankind, but the discipline of orienteering originated in the Swedish military in the 1800s and is now a sport practiced throughout Scandinavia. Equipped with a map and compass, participants navigate between control points spread across many miles, making tradeoffs between distance and difficult terrain as they strive to complete the course in the shortest amount of time.
The strategies employed by novice users seeking information resemble the sport of orienteering. Users with low levels of domain and technical expertise, typified by Claire Jones, share three main characteristics.
Novices tend to enter queries that use about half as many words as experts. Domain novices (like both Claire and Fane Tomescu) feel particularly unsure of which terms to use.
Novices perform more queries than experts, but look at fewer documents. Although they frequently reformulate their query, technical novices often suffer from an anchoring bias and make only small, inconsequential changes.
Novices are much more likely than experts to hit dead ends and seek to get back to a previous state.
These behaviours result in an orienteering-like strategy where novices “test the waters” with a short, general query, quickly skim the top results returned, and immediately reformulate the query based on their improved knowledge of the subject.
Design considerations for Novices
There are a number of design considerations which can help novice users succeed at orienteering. In particular, novices need help formulating their query, refining their query, and backing out of trouble.
As-you-type suggestions can help users get off on the right foot when they’re uncertain what to search for. Research has shown that users are more capable of choosing a viable option from a list than they are of composing a question out of thin air. Autosuggest provides an opportunity to help users express specific terms (such as airports or stocks), and to suggest queries that other users have performed in the past.
Image 3: Autosuggest on Etsy.com
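The mechanics behind a basic autosuggest are simple enough to sketch. The following is a minimal illustration, not any particular site’s implementation: past queries are sorted once, and each keystroke becomes a fast prefix lookup against that list.

```python
from bisect import bisect_left

def build_index(past_queries):
    """Sort past queries once so prefix lookups are fast."""
    return sorted(q.lower() for q in past_queries)

def suggest(index, prefix, limit=5):
    """Return up to `limit` past queries that start with `prefix`."""
    prefix = prefix.lower()
    start = bisect_left(index, prefix)  # first entry >= prefix
    results = []
    for q in index[start:]:
        if not q.startswith(prefix):
            break
        results.append(q)
        if len(results) == limit:
            break
    return results

index = build_index(["solar system", "solar panels", "saturn rings", "solar eclipse"])
print(suggest(index, "sol"))  # the three "solar ..." queries, alphabetically
```

A production system would rank suggestions by popularity rather than alphabetically, but the core interaction, matching partial input against known-good queries, is the same.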
After users have performed an initial search, they may still need help refining the query. A list of related searches can help the user break out of their anchoring bias and help them arrive at the optimal set of results.
Image 4: Foodily.com places related searches on the same line as breadcrumbs
Avoid zero results
If the user is presented with no search results, he may be disheartened enough to give up his quest. Avoid zero-result screens if possible. Tools such as automatic spelling corrections and query expansion (using synonyms and lemmatisation, for instance) can help.
Image 5: Amazon.com’s handling of zero results
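Query expansion can be sketched as a simple fallback: try the literal query first, and only when it returns nothing, retry with synonyms substituted for each term. The synonym table and function names below are illustrative assumptions; a real system would draw on a thesaurus or lemmatiser.

```python
# Hypothetical synonym table for illustration only.
SYNONYMS = {"car": ["auto", "automobile", "vehicle"],
            "photo": ["photograph", "picture"]}

def search(documents, terms):
    """Naive search: return documents containing every query term."""
    return [d for d in documents if all(t in d.lower() for t in terms)]

def search_with_expansion(documents, query):
    """Try the literal query; on zero results, retry with synonyms."""
    terms = query.lower().split()
    results = search(documents, terms)
    if results:
        return results
    for i, term in enumerate(terms):
        for alt in SYNONYMS.get(term, []):
            expanded = terms[:i] + [alt] + terms[i + 1:]
            results = search(documents, expanded)
            if results:
                return results
    return []

docs = ["Cheap auto insurance quotes", "Picture framing supplies"]
print(search_with_expansion(docs, "car insurance"))
```

Here the literal query “car insurance” matches nothing, but swapping “car” for “auto” rescues the search, exactly the kind of quiet intervention that keeps a novice from hitting a dead end.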
Because novices tend to take wrong turns, they often need help navigating back to a previous state. Breadcrumbs are an ideal solution because they communicate both the user’s current location, as well as how to go back.
Image 6: Breadcrumbs on Zappos.com
Image 7: In Star Trek, crew members of the USS Enterprise stand on transporter platforms to be beamed down to a nearby planet.
While novices orienteer, experts teleport. Akin to being teleported to a precise but distant location, users with high domain and technical expertise like Angela Baer tend to jump directly to their final destination.
Experts enter longer, more specific queries than novices. Domain experts like William Hayes often rely on their vocabulary of specific terminology, while technical experts such as Fane Tomescu are more likely than novices to use formatting techniques such as quotation marks in their queries (87% of experts compared with 47% of novices, according to a 2000 study).
Experts usually amend their queries less often than novices and move forward with a higher degree of confidence.
More Documents Examined
Experts tend to review more documents and follow a greater number of links within those documents. Domain experts are especially adept at quickly determining whether or not a given document is useful.
In essence, experts often construct queries using numerous highly specific words which act to teleport them directly to a destination, cutting out the query reformulation often practiced by novices. After having arrived at a destination, experts are then likely to explore the surrounding territory.
Design considerations for Experts
Designing for experts involves facilitating their teleporting behaviour, helping them get to their destination as quickly as possible.
Technical experts like Fane are often willing to learn special commands in exchange for having greater control. Commonly supported operators include AND, OR, and quotes for searching for exact phrases.
Image 8: Wolfram Alpha is designed to understand domain-specific terminology and return computed answers.
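A toy version of such operator support might look like this; it handles only quoted phrases and implicit AND, whereas real engines use a full query parser:

```python
import shlex

# Toy matcher for expert query syntax: quoted phrases must appear verbatim
# in the document; unquoted terms are simply AND-ed together.
def matches(query, document):
    doc = document.lower()
    terms = shlex.split(query.lower())  # shlex keeps "quoted phrases" intact
    return all(term in doc for term in terms)

doc = "climate change report for coastal cities"
print(matches('"climate change" coastal', doc))  # phrase and term both present
print(matches('"change climate" coastal', doc))  # phrase order matters, so no match
```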
Keyboard shortcuts can also increase the speed of interaction. Google, for instance, allows users to press the up/down arrow keys on the keyboard to traverse results, and press return to go to the URL of the selected result.
Image 9: Google places a caret beside the currently-selected result.
Filtering & sorting
Experts are more likely to engage with advanced sort and filtering controls than novices, including operations such as selecting ranges, filtering by format, or excluding certain terms (e.g. everything that includes “apples” but does not mention “oranges”).
Image 10: Getty Image’s Moodstream lets users search for stock photos using sliders.
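The “apples but not oranges” example boils down to inclusion and exclusion filters, which can be sketched as:

```python
# Sketch of an include/exclude filter over a result set. The sample
# results are invented; real systems would filter on indexed fields.
def filter_results(results, include=(), exclude=()):
    out = []
    for text in results:
        lower = text.lower()
        if all(t in lower for t in include) and not any(t in lower for t in exclude):
            out.append(text)
    return out

results = ["apples and pears", "apples and oranges", "orange juice"]
print(filter_results(results, include=["apples"], exclude=["oranges"]))
```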
As-you-type completion interfaces most often display query suggestions to users. However, another use case is to present actual results in the autocompletion interface, enabling users to skip the search results screen altogether and go directly to a specific document.
Image 11: Rather than suggesting terms to search for, Nutshell returns search results directly without needing to go to a separate page.
Result table of contents
Providing links to the top destinations within a result can reduce the number of steps required for the expert to reach his destination.
Image 12: Google sometimes provides links to the top-level pages within a given site.
Yin and Yang
While novices and experts practice two very different approaches to information seeking, it’s important not to overemphasise one at the expense of the other. As illustrated by the ancient Chinese yin-yang symbol, understanding the behaviour of both novices and experts can help us design more informed, balanced search experiences.
The author would like to thank Cennydd Bowles for organising the UK writer’s retreat during which this article was written, as well as for the editorial guidance that he provided.
 Vicki L. O’Day and Robin Jeffries; “Orienteering in an Information Landscape”:http://www.hpl.hp.com/techreports/92/HPL-92-127.pdf
 Christoph Hölscher & Gerhard Strube; “Web Search Behavior of Internet Experts and Newbies”:http://www9.org/w9cdrom/81/81.html
 Marti A Hearst; “Search User Interfaces”:http://searchuserinterfaces.com/book/sui_ch3_models_of_information_seeking.html#section_3.5
 Morten Hertzum and Erik Frokjaer; “Browsing and Querying in Online Documentation”:http://www.cparity.com/projects/AcmClassification/samples/230570.pdf
 Christopher D. Manning, Prabhakar Raghavan and Hinrich Schütze; “Introduction to Information Retrieval”:http://www.cambridge.org/us/knowledge/location/?site_locale=en_US; Cambridge University Press, 2008.
 Jaime Teevan, Christine Alvarado, Mark S. Ackerman and David R. Karger; “The Perfect Search Engine is Not Enough”:http://people.csail.mit.edu/teevan/work/publications/papers/chi04.pdf
If your clients are not yet asking you to design transitions, they likely will on your next project. Transitions are hot, and not just because they entertain the eye. In confined mobile computing interfaces, on tablet devices or in complex virtual environments, transitions are an authentic, minimalist way of enabling way-finding, displaying system state and exposing crucial functionality. In short, they are key to creating a superior user experience.
Transitions as design elements
Since the 1980s, designers have been drawing wireframes to represent web pages and device interfaces.1 In the beginning, wireframes were static schematics that showed a single state of the page. With the emergence of DHTML in the 1990s, it became necessary to draw different states of specific dynamic page elements, so designers adapted the wireframe methodology to document the beginning and end states of those dynamic elements. Still, designers and engineers had little or no control over what happened between the beginning and end states: the browser or the operating system handled all transitions.
More recently, sophisticated mobile touch platforms like the iPhone, Android, Palm and Windows Mobile have allowed unprecedented control over the speed and structure of transitions, giving designers more tools with which to create a better experience in a confined mobile space.2 Simultaneously, on the web, dynamic platforms like Flash and Flex gained tremendous ground, making it possible for designers to think about and document transitions, because those were now part of the customer experience.
With the release of the Apple iPad, the Age of Transition has come to its full potential. On the iPad, Apple takes full advantage of some of the principles and ideas the company previously explored and perfected using the iPhone. On the bigger iPad screen, transitions achieve a new level of detail and sophistication, making the device come alive, and become a powerful, integral part of the experience.
Transitions Require Thinking Differently
As Jonathan Follett writes in his article “Interfaces That Flow: Transitions as Design Elements”:http://www.uxmatters.com/mt/archives/2007/04/interfaces-that-flow-transitions-as-design-elements.php,3 many UX designers approach projects from a combination of information architecture and interaction design. These disciplines involve thinking that is quite different from constructing the continuous linear narratives required to design and document transitions. Nevertheless, by borrowing freely from the lessons of early animators, it is quite possible to adapt the existing wireframe methodology to convey the structure and rhythm of a user interface transition.
The task consists of wireframing each of the important changes (or “key frames”) that occur during the transition and stringing those wireframes together in a storyboard. By documenting the key aspects of the transition, it is possible to share them with the larger team and try out different transition designs. Documenting the transitions also allows us to step back and consider them in the larger context of a specific use case or the overall goal of progressive engagement and immersion.
Understanding iPad Transitions
In order to be able to design and document transitions using storyboards, we first have to understand the design principles that designers of transitions use to convey the desired meaning. Let’s take a look at the Apple, Inc. video in Figure 1 showing selected transitions from what is arguably the most popular iPad application today: iTunes. Although many different transitions are shown in the video, we will be looking specifically at two of them: “opening the iTunes application” (0:17-0:20 min) and “opening album details” (0:30-0:36 min).
Figure 1: Video of iTunes transitions on the iPad [“View larger version on YouTube”:http://www.youtube.com/watch?v=Z03PR_4Ln90]
Borrowing from Chet Haase and Romain Guy’s excellent Devoxx presentation “Animation Rules!”:http://www.parleys.com/#st=5&sl=1&id=1578,4 we can identify seven key principles that specifically apply to the animated transitions on the iPad:
# Component Relationship (background-foreground)
# Illusion (motion perception and perceptual constancy)
# Exaggeration (highlighting states, actions, and changes in state)
# Staging (camera view, lighting, focus)
# Acceleration and Deceleration (slow in and out)
# Metaphors (using real-world analogies to convey complex digital events)
# Simplicity (avoiding noise)
To understand how the seven principles above combine to make the transition work its magic, let’s do a step-by-step breakdown of the “opening the iTunes application” transition, shown in Figure 2.
Figure 2: Opening iTunes Application Step-by-Step
Using our seven key principles:
Component Relationship (background-foreground)
This transition is essentially the process by which the iTunes application comes into the foreground, while the rest of the apps recede into the background. In the first row, the transition starts out with the home screen and apps icons firmly in the foreground. By the end of the row, we can see that the home screen recedes and darkens, while the iTunes app (represented by a white square) slowly comes into the foreground. By the second row, the background-foreground transition is essentially complete – we can see only the loading iTunes app against the black background.
Illusion (motion perception and perceptual constancy)
This transition creates its magic via an illusion of “flying into” the device, and eventually meeting the white square that represents the iTunes app. To accomplish this, the animation shows us “flying” through the layer of the apps icons on the home screen. The other app icons begin to “fly” to the sides of the screen in a circular pattern, as shown in row 1. The most interesting part of the illusion is the kind of “bait-and-switch”. If you look carefully, you’ll see that the app icons never make it off screen. Just before we “pass through the icons layer” and “witness” the icons “flying off screen”, the background goes completely black, and our attention is focused on the white rectangle. The illusion is complete.
Exaggeration (highlighting states, actions, and changes in state)
In this transition, the lighting effects are used to exaggerate the switch between the background and foreground. In the second row, the background goes completely black to highlight the change in state. Exaggeration can also be used to warp the shape of an object to emphasize movement, as is done in “genie” effects and transitions.
Staging (camera view, lighting, focus)
Subtle but powerful lighting is used throughout the transition as the primary means of focusing our attention: the gradual darkening of the background keeps the opening window of the iTunes app in the foreground (principle 1). Lighting is also used to accomplish the bait-and-switch in the Illusion principle.
Acceleration and Deceleration (slow in and out)
Our brains know from experience that objects do not start moving at top speed or “stop on a dime”. To make the movement more life-like, the animation accelerates into the movement very slowly, picking up pace in later screenshots, as evidenced by the increasing “smudginess” of the icons in the first row. Not surprisingly, the bait-and-switch happens at the fastest moment of the transition to pull off the illusion that the home screen icons actually “fly off screen”. The transition then slows down again in the last row to smoothly fade in the iTunes content elements, deliberately giving the auxiliary page elements and pictures time to “catch up” and making the page load appear smoother.
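Slow-in/slow-out is typically implemented with an easing curve rather than linear motion. Here is a minimal sketch using a standard cubic ease-in-out function; this is an illustrative assumption, not Apple’s actual timing curve:

```python
# Sketch of the slow-in/slow-out principle: a cubic ease-in-out curve.
# t is normalised time, running from 0.0 to 1.0 over the transition.
def ease_in_out(t):
    """Starts slowly, accelerates through the middle, then decelerates."""
    if t < 0.5:
        return 4 * t ** 3
    return 1 - ((-2 * t + 2) ** 3) / 2

# Displacement per step is small at the ends and largest in the middle,
# which is exactly where a "bait-and-switch" is least noticeable.
positions = [round(ease_in_out(i / 10), 3) for i in range(11)]
print(positions)
```

Sampling the curve like this shows the motion covering roughly half its distance in the middle fifth of the transition, which is the life-like behaviour described above.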
Metaphors (using real-world analogies to convey complex digital events)
The most effective transitions use real-life elements to provide a frame of reference which makes the animation more realistic. In this case, the icons on the home screen are moving to the sides, creating an overall illusion of moving through space, or deeper “into” the device itself to convey a digital event of opening an application inside the device.
Simplicity (avoiding noise)
The overriding theme of this transition is its apparent simplicity. During the transition, iTunes is not doing anything particularly complicated or earth-shattering. The magic comes not from one particular element, but through carefully blending and combining the lighting and movement to create a smooth cohesive digital dance, perfectly orchestrated from beginning to the end.
Storyboarding iPad Transitions
The key to successfully designing and storyboarding the transitions is understanding and applying the seven animation principles we discussed above. To demonstrate how this can be done, let’s use a slightly more complex transition: the iTunes “opening album details”, shown in Figure 3.
Figure 3: Opening iTunes Album Details Step-by-Step
Here again, we see the seven principles at work:
Component Relationship (background-foreground)
This entire transition can be viewed as bringing the selected album cover into the foreground, while the rest of the iTunes application recedes slightly into the background.
Illusion (motion perception and perceptual constancy)
The animation shows us the illusion of the album flying forward on the screen while flipping 180 degrees. The most interesting part of the illusion is the switch from the darker gray “back” of the album to a white loading screen (midway through the second row). This sleight of hand changes the focus to the white cover to make the transition believable.
Exaggeration (highlighting states, actions, and changes in state)
In this transition again, the lighting effects are used to exaggerate the switch between the background and foreground. Midway through the second row the album turns completely white against the slightly darker background.
Staging (camera view, lighting, focus)
In the beginning the iTunes application darkens gradually, and reaches its full saturation about half-way through the second row to create the background against which the album will be staged. The album, on the other hand, switches first from color to darker gray, then to solid white to jump to the foreground.
Acceleration and Deceleration (slow in and out)
The animation starts slowly, and achieves top speed half-way through the second row to quickly switch from the dark gray flipping rectangle to a solid white loading page. Just as in the previously discussed “opening iTunes” transition, this transition also slows down in the last row to smoothly fade in the iTunes album cover content elements.
Metaphors (using real-world analogies to convey complex digital events)
This transition invokes the magical feeling of picking an old LP album off the shelf and flipping it over to see the back cover, by creating the illusion of the album jumping off the page and flipping 180 degrees horizontally around the middle.
Simplicity (avoiding noise)
While a bit more complex than the “opening iTunes application”, this transition can nevertheless be adequately described by looking at only 12 screenshots.
Once the transition design principles are understood, the process of drawing the storyboard becomes fairly straightforward. I use the same method that Galileo Galilei used four centuries ago when he first diagrammed the step-by-step movement of sunspots in 1613.5 The basic transition storyboard for the “Opening iTunes Details” transition is shown in Figure 4.
As you try your own hand in transition storyboarding, here are a few points to keep in mind:
Use appropriate materials
To diagram transitions, I prefer to use medium-size post-it notes that measure 3 inches square. I draw each of the steps in the transition using a soft retractable pencil with a good eraser. This allows me to quickly diagram portrait and landscape transitions, and everything in between. Because the iPad is a rectangle, not a square, I leave the extra space on the right of the post-it note (on the bottom for landscape) to write an additional explanation for each step, or simply leave it blank.
As I said above, on the iPad lighting is foundational to expressing the Component Relationship, Exaggeration, and Staging principles, so it makes sense to take a disciplined approach to drawing various shades of light and dark in your storyboard. I find that the easiest approach is to draw shading on top of the picture as light lines at a 45-degree angle. As you can see in the last three post-its, I use tighter line spacing to indicate progressively darker shading.
Get the basics down first
When I first approach the transition design, I make only the post-its necessary to convey the overall movement of the various elements and basic component relationship. I sketch quickly using very rough strokes, and use a ruler and templates whenever possible to make my job easier.
Stick to 6-8 post-its
As you can see in Figure 4, it is not necessary to draw all 12 original key frames we saw in Figure 3. To convey the basic structure of a transition, I typically try to use only 6-8 post-it notes. Using fewer steps keeps me focused on the principle of simplicity: if it takes me more than 8 post-it notes to describe the transition, it is probably too complex, and I immediately look for unnecessary elements or animation that can be eliminated or scaled back.
Ignore Acceleration and Deceleration
Above we spoke at length about the Acceleration and Deceleration principle. This idea is essential to creating effective, believable transitions. However, when drawing a rough storyboard of 6-8 post-its, this is the one principle that I have found can be safely ignored. Once your team understands this principle, most people can extrapolate from your rough drawings to imagine the complete smooth transition “in their mind’s eye”. As long as you make it clear that this is only a rough storyboard and that the end result will in fact follow the principle, you can safely ignore the subject and concentrate on the relationship, movement and shading of screen elements.
Draw the complete story
Transitions do not happen in isolation – they are an integral part of the overall customer experience. Thus, when I storyboard transitions, I typically do it in the context of the entire use case. This helps me make sure that the particular transition makes sense in the complete context and in combination with other transitions. For example, when I use the “flip” transition to show the search results on the map, and then use the “slide back” transition to go back to the list of search results, the storyboard will quickly reveal the inconsistency in the mental model of the interface I am trying to create, and the problem transition will feel awkward when walking through the entire use case storyboard.
Sketch a few different transition designs
When I approach a given transition, I usually try out 3-4 different design approaches to see which one creates the effect I am seeking. Sometimes I find that I need to create 10 or more sets of ideas for more complex and critical interactions. The point of this initial sketching is not to create the complete and final blueprint, but to help you visualize how a given transition design option would feel alongside the rest of the app’s interactions. Working with post-it notes allows me to quickly add a new transition or reposition the existing post-its to create and try out several different scenarios, often during an active team discussion. I recommend making copies or taking photos of your boards periodically to preserve promising design directions before repositioning the post-it notes and changing the transition layout again.
Obtain Initial Stakeholder Approval
In addition to helping you find the best design approach, a rough storyboard is also a fantastic tool for conveying various design options to your team for joint discussion and brainstorming as well as for obtaining initial stakeholder buy-in. It’s a lot easier to discuss the merits of a particular transition movement and information architecture when everyone is quite literally on the same page looking at your complete use case storyboard.
Creating the Final Transition Blueprint
When you obtain the initial stakeholder approval using your rough storyboard drawing, you will need to document the final storyboard design for the engineers to actually build. Here you have a couple of options.
One approach is to use Flash to create the transition with the final high-fidelity look and feel. This is certainly a valid option. However, I have found Flash to be more useful for higher-fidelity usability testing and final stakeholder approval than for describing transitions to engineers. Here is why: most developers do not read Flash code, and most transitions are simply too fast for the eye to understand the subtleties of acceleration and shading simply by looking at a running Flash file. I have had several instances of getting not quite what I specified, or else something completely different, only to have the engineers claim that “this is exactly what the Flash looked like”. This is an especially big problem with distributed multi-lingual teams where communication is an issue.
The method that I have found to work well is to specify (i.e. create a wireframe for) each of the frames at regular intervals of 50-100 milliseconds for the entire duration of the transition. Most transitions last between 0.5 and 1.2 seconds, so you will need to create anywhere from 5 to 24 wireframes in your favorite wireframing tool such as Fireworks, OmniGraffle or Visio. Stringing these frames together in document pages creates a short movie: a complete blueprint describing the position, shading, and movement of each element, clear and exact enough for the engineers to create the transition you envisioned.
While this seems at first like a lot of work, after a bit of practice the wireframing goes fairly quickly, as the difference between each new page and the one before it is only a slight change in position and shading. As long as we firmly keep in mind the principles by which iPad transitions work, we can easily diagram relevant steps for rich, expressive transitions.
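Generating those evenly spaced frames is mechanical enough to sketch in code. The element properties (position, scale, shade) and values below are illustrative, not taken from the iTunes transition:

```python
# Sketch of blueprinting a transition as evenly spaced key frames:
# linearly interpolate each element's properties at a fixed sampling interval.
def blueprint_frames(start, end, duration_ms, interval_ms=100):
    """Return one dict of interpolated properties per sampled frame."""
    steps = duration_ms // interval_ms
    frames = []
    for i in range(steps + 1):
        t = i / steps  # normalised time for this frame
        frames.append({k: round(start[k] + (end[k] - start[k]) * t, 2)
                       for k in start})
    return frames

# An 800 ms transition sampled every 100 ms yields 9 wireframe pages.
frames = blueprint_frames({"x": 0, "scale": 0.2, "shade": 1.0},
                          {"x": 200, "scale": 1.0, "shade": 0.0},
                          duration_ms=800)
print(len(frames))  # 9
```

Linear interpolation is used here for simplicity; in practice you would sample along the easing curve of the Acceleration and Deceleration principle so that consecutive wireframes bunch up in the middle of the movement.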
To continue this conversation, add a comment below or reach out to Greg at “@designcaffeine”:http://twitter.com/designcaffeine or through his website, “DesignCaffeine.com”:http://www.DesignCaffeine.com.
Interested in more UX sketching techniques? Join us Saturday, May 28th, 2011 at UX SketchCamp [“SketchCamp.com”:http://www.sketchcamp.com or “@sketchcamp”:http://twitter.com/sketchcamp on Twitter] in San Francisco for a chance to learn from the experts, practice UX sketching and share what you know with others.
1. “Wireframing Marathon Starts”:http://ciohappyhour.com/wireframing-marathon-starts/; CIO Happy Hour, September 2010
2. See my article “Designing Mobile Search: Turning Limitations into Opportunities”:http://www.uxmatters.com/mt/archives/2010/03/designing-mobile-search-turning-limitations-into-opportunities.php; UX Matters, March 2010.
3. Jonathan Follett; “Interfaces That Flow: Transitions as Design Elements”:http://www.uxmatters.com/mt/archives/2007/04/interfaces-that-flow-transitions-as-design-elements.php; UX Matters, July 2007
4. Chet Haase and Romain Guy; “Animation Rules!”:http://www.parleys.com/#st=5&sl=1&id=1578; Devoxx ’09
5. Galileo documented the movement of the sun spots in his triumphant “Istoria e Dimostrazioni Intorno Alle Macchie Solari e Loro Accidenti Rome”:http://physics.ship.edu/~mrc/pfs/110/inside_out/vu1/Galileo/Things/g_sunspots.html (History and Demonstrations Concerning Sunspots and their Properties); 1613.