A Truly Ambitious Product Idea: Making Stuff for People

Written by: Dave Feldman

When I was eleven, my parents bought a Mac Plus. It had a tiny monochrome screen, a floppy drive, and 1MB of memory. And it came with something called HyperCard. HyperCard let you make stuff. It had documents called stacks, each a series of cards – similar to PowerPoint today. In addition to graphics and text, you could create buttons and tell them what to do – flip to another card, show or hide an object, and so forth.

Down at the bottom of the screen was a little window where you could type simple English-like commands – things like go to card 2 or beep. Once you’d mastered those, you could add them to your buttons or trigger them at certain times, creating real interactivity. Pretty soon I was making little games and utilities. It was the coolest thing ever.

HyperCard’s Home Stack: Pure Nostalgia

HyperCard came with something called the home stack that opened when you first launched it. I looked at it and thought, This isn’t very useful. It shows up all the time but it doesn’t do much. So I made a better one. It included various utilities, and of course a rock-paper-scissors game. I made packaging and convinced the local Mac store to sell it for $7.

It sold two copies.

Since then I’ve worked on products with more than twice as many users, but the story remains the same. This isn’t very useful. This doesn’t serve people’s needs. Let’s make a better one.

In college I discovered a career for what I did: user interface design. And though the title has changed over the years – user experience designer, interaction designer, product manager, product designer, founder – the motivation hasn’t. Technology is confusing and doesn’t meet people’s needs. I want to fix that.

Eat Your Vegetables

These days, it’s fashionable to talk about audacious ideas. Paradoxically, it’s also popular to focus on ideas that can be built in a month.

In a post last year, Paul Graham listed Frighteningly Ambitious Startup Ideas and spawned a bumper crop of companies (though my favorites, Bring Back Moore’s Law and The Next Steve Jobs, don’t seem to have much traction). Wired’s cover story for February was 7 Massive Ideas That Can Change the World.

But I can’t help thinking we’ve skipped our vegetables and gone straight to dessert. We are insinuating ourselves into more and more of people’s lives, yet we haven’t managed to meet their needs in predictable, understandable, let alone enjoyable ways.

I watch people using their devices and I cringe. They get their single-click and double-click mixed up. They open an email attachment, update it, and then can’t understand why their changes aren’t in Documents. They try to set up iCloud and end up creating three Apple IDs. They miss out on all the useful things technology can do for them, lost in a sea of complexity, confusion, and techie-centric functionality. These things were supposed to be labor-saving devices, right?

Make no mistake: This is our fault. To begin with, we’ve created ever-more-inconsistent expectations over time. Consider single- vs. double-click. Easy, right? You single-click to select, double-click to open. Unless it’s a webpage. Or Apple’s column view, where selecting and opening are the same thing so it doesn’t matter. Well, for folders; for documents, it matters.

Anyway, it’s really easy to tell if you’re in a webpage or not so you know which convention to use. Just look at the top of the screen, on the left. It should say Firefox, or Safari, or Chrome. Oh wait, you’re on Windows. Look at the top of the window. No, the frontmost window. See, it has a bigger shadow than the others. Oh wait, you’re on Windows 8? Well, are you in Metro or not? Oh wait, they don’t call it Metro anymore. I forget what they call it. Do you see a lot of colorful flat boxes? What were you trying to do again? Hey, where are you going?

You may think I’m overcomplicating things for effect. I’m not. It seems simple to you because all that stuff is already in your head. When you switch from Gmail in a browser, to Outlook on Windows, to Mail.app on Mac, you know which conventions change. You have what designers call a mental model, rooted in years of experience and history, that allows you to make the right call. Most people don’t – nor should they have to.

And these interaction details are the tip of the iceberg. We do a disappointing job of understanding what people outside our bubble are trying to accomplish. Let’s be honest: We mostly make products for ourselves. Later, when they’re successful, we start wondering how people use them. We do user studies and surveys and ethnographies and then ignore the results because it’d be expensive to fix and besides, they’ll figure it out, right? I mean, we did. We lack the comprehensive understanding we’d need to make real, substantive change, to make products that are both usable and useful.

Downward Arrow

Therapists sometimes use the downward arrow technique with their clients. It starts with the apparent problem and proceeds through a series of “why” questions to the underlying issue:

Client: “I get nervous speaking in class.”

Therapist: “Why do you get so nervous?”

Client: “I’m worried that I might say something stupid.”

Therapist: “And if you did?”

Client: “I would be so embarrassed!”

Therapist: “Why? What would be so bad about it?”

Client: “It would mean I’m not good enough.”

And so forth.

Product design requires a similar process: start with a design or feature question and dig down until you find the assumptions that underlie it:

Me: Why do you ask for a user’s password every time he downloads a free app?

Imaginary Apple Guy: For security.

Me: What do you mean by security?

IAG: Well, if someone gets hold of your phone, they’d be able to install apps without your permission.

Me: And what would be so bad about that?

IAG: The apps could do malicious things with your phone.

Me: But doesn’t Apple sandbox apps and review them for malicious behavior?

IAG: Sure, but a maliciously installed app could connect to your Facebook account.

Me: And is the risk of that happening when your phone is stolen worth requiring a password for every install?

Note that the point isn’t to make me look smart, or simply to reveal flaws. By the end of that (fictitious) exchange, we’ve gone from an ill-defined concept (“security”) to a specific question that deals in user needs.

The Product Mantra

To answer such questions we need the fundamental, defining goals of our product. Who is it for? What purpose does it serve? It’s impossible to evaluate trade-offs otherwise.

When I was at AOL our illustrious head of Consumer Experience, Matte Scheinker, introduced the notion of a product mantra: a clear, concise description of your product. Critically, it must be specific enough to disagree with.

Using my own to-do app, Stky, as an example:

  • Mantra A: Stky is a to-do app for naturally disorganized people. It keeps overload in check by having you reprioritize each day’s tasks anew.
  • Mantra B: Stky is a productivity app anyone can use. Unlike its competitors it keeps you in control of your tasks and on top of your life.

Both mantras are accurate. But only Mantra A is specific enough to disagree with. Do disorganized people need a to-do app? Is daily reprioritization too much work, especially for such people?

Mantra B could describe nearly anything.

Now, suppose I’m deciding whether to add a new feature to Stky: multiple sticky notes. You could have your Work sticky, your Home sticky, maybe a Stuff to Read sticky, and the like. Seems useful, and certainly I’ve had users request it. Let’s hold it up to our mantras:

  • Using Mantra A: Do we want to add additional management overhead to an app for disorganized people? Probably not. And if the sticky represents our daily list of priorities, doesn’t adding multiple stickies break the whole paradigm? Probably. So maybe it’s not a good idea.
  • Using Mantra B: Well, multiple stickies means more control, right? And lots of people want it, and we want a product anyone will use. So I guess it’s a good idea…along with nearly any other idea.

Even better, this exercise almost forces us back into downward arrow. Why do users want multiple stickies? What are they trying to accomplish? Is that deeper goal consistent with our mantra? If so, is there another feature that would meet their need in a way that fits the product better?

Asking why and writing a mantra won’t magically give us insight into our users. But it will force us to form hypotheses, which can be tested against evidence in the world around us.

And the constraints we create via those hypotheses allow us to make choices. Because the great products, the ones we revere, are invariably the work of product teams brave enough to make choices. We marvel at Apple’s clean, usable design. We call it simplicity but it’s not that: It’s knowing what to keep and what to leave out and having the guts to disappoint some of the users all of the time and all of the stakeholders some of the time. Many of us already know that, but we can’t bring ourselves to choose when push comes to shove.

None of this is a substitute for user research. We still need usability tests, ethnographies, brainstorming sessions, click data, bucket tests, discovery, and all the rest. But in the absence of clear hypotheses and specific questions, user research is a little like the proverbial tree falling in the forest. Research tests our assumptions and tells us where we’re right or wrong; it doesn’t tell us what to build.

This isn’t the kind of audacious problem we solve all at once…nor do we have to. Every product that actually makes someone’s life better is a piece of the solution – not just for the life it improves, but for the designer who’s inspired by it, the team that decides to one-up it.

Make no mistake: This is hard stuff. It requires tenacity, and bravery, and empathy. It requires observing how people live their lives, and then handing them products that aren’t at all what they asked for. It needs more user-centered ways of doing bug triage and structuring development workflow. But as technology becomes everyone’s ever-more-constant companion I can think of no greater or more worthy challenge.

When I renamed my blog last year, I created a tagline: “We make stuff, for people.” It was meant to be funny, sure, but also to encapsulate everything I’ve said here. Technology is meaningless without people; yet, as technologists, we’re prone to forgetting that. We end up debating strange, empty questions. Does the world really need another photo sharing service? Is skeuomorphic design good or bad? Is Ruby better than Python? None of it matters on its own.

It’s important to make stuff. But it only matters if we make stuff, for people.

Five Big UX Topics in 2012

Written by: Catalyst Group

For years, UX professionals have vigorously lobbied for a “seat at the table” when it comes to formative decisions about products and product development. Looking back at 2012, trends indicate that this wish is becoming reality. Many leading UX consultants reported that their clients are more open to research and design methods with a UX focus than they have been in the past.

This elevated focus on UX ideas and concepts will require informed engagement with several high-level topics that emerged in 2012. This article discusses five of the themes that we expect will have relevance into 2013.

1. UX in the C-Suite

2012 was the year that UX Design crashed the C-Suite. The appointment of Marissa Mayer as Yahoo’s CEO, and the emphasis on her background in user experience as a prime reason for this appointment, is one of several examples of increased awareness, at the highest levels of corporate management, of the importance of user experience to the success of a company’s products or services. These developments suggest a growing understanding that UX is not merely a tactical part of a product development process mainly driven by inward-facing operational strategies or broad, quantitative market analysis. Increasingly, successful companies maintain a sharp focus on the experience, broadly construed, of their end consumer from the earliest formation of product strategies through their design, development, and launch.

Another indication that language and ideas around user experience are gaining traction is the surprising appearance of skeuomorphism (basically an interface that mimics a similarly functioning analog item) in the mainstream media. The New York Times discussed the concept in the context of a management shake-up at Apple that may herald a departure from the “real world” design style preferred by Steve Jobs (but apparently disliked by many design authorities inside and outside of Apple) towards a minimalist aesthetic that is consistent with Apple’s hardware design. Whatever software design direction Apple ultimately selects, the popular discussion of the business impact of this decision is likely to inspire other corporate leaders to ask themselves how an improved user experience could help them achieve their business goals.

2. Multi-Context UX Goes Mainstream

Lines that could once be neatly drawn between mobile and desktop experiences continued to blur in 2012. With more and more people accessing content from a multitude of devices — often at the same time — people have come to expect a seamless and consistent experience from site to site and from device to device. A variety of techniques, technologies, and philosophies are at the UX designer’s disposal at a time that some consider the “post-desktop era”.

With mobile browsing continuing to grow in the United States and worldwide, it makes sense for designers to focus on getting the mobile experience right. “Mobile first” encourages designers to start with a small screen, focusing on key interactions and content prioritization. “Mobile first” may not always be the right approach, but starting on a small screen often reveals solutions to complex design problems.

“Progressive enhancement” is not a new idea, but has gained momentum lately thanks to responsive design, HTML 5, and CSS3. Progressive enhancement starts with a useful experience as a foundation, adding more complex functionality only if the browser or device can support it. Similar to “mobile first”, progressive enhancement ensures that the base experience is solid before adding in additional bells and whistles.
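
To make the idea concrete, here is a minimal sketch of the pattern in TypeScript. The element IDs and the /search endpoint are hypothetical, invented for illustration: the plain HTML form works everywhere, and richer in-page behavior is layered on only after feature detection succeeds.

    // Progressive enhancement sketch: the baseline form submits normally;
    // richer behavior is added only when the browser supports it.
    // Element IDs and the /search endpoint are hypothetical.
    function enhanceSearch(): void {
      const form = document.getElementById("search-form") as HTMLFormElement | null;
      if (!form) return; // no form on this page; nothing to enhance

      // Enhance only if the browser supports fetch and the History API.
      if (!("fetch" in window) || !("pushState" in history)) return;

      form.addEventListener("submit", async (event) => {
        event.preventDefault(); // take over only now that we know we can
        const query = String(new FormData(form).get("q") ?? "");
        const response = await fetch(`/search?q=${encodeURIComponent(query)}`);
        const html = await response.text();
        const results = document.getElementById("results");
        if (results) results.innerHTML = html; // update in place, no reload
        history.pushState({ query }, "", `?q=${encodeURIComponent(query)}`);
      });
    }

    enhanceSearch();

If fetch or pushState is missing, the function simply returns and the user still gets the working baseline form, which is the whole point of the approach.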

It’s been two years since Ethan Marcotte coined the term “responsive design”, and in that time we’ve seen more websites that adapt to a user’s screen size, orientation, and device. We’ve seen some excellent designs, including Smashing Magazine and the newly launched Mashable. Responsive design does not always make sense, and a responsive website cannot necessarily replace a native application. As responsive design matures, we will see more innovative techniques and best practices.
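
Responsive layouts are mostly a matter of CSS media queries, but the same breakpoint logic is available to scripts when behavior, not just presentation, must adapt. A small sketch in TypeScript (the breakpoint and class name are hypothetical):

    // Responsive-design sketch: react to the same breakpoint the CSS uses.
    const mobileQuery = window.matchMedia("(max-width: 600px)");

    function applyLayout(isMobile: boolean): void {
      // Collapse the navigation into a menu button on small screens;
      // show the full navigation bar otherwise (styling lives in CSS).
      document.body.classList.toggle("nav-collapsed", isMobile);
    }

    applyLayout(mobileQuery.matches); // set the initial state
    mobileQuery.addEventListener("change", (e) => applyLayout(e.matches));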

It is still difficult to predict a person’s context. We may know what kind of device or screen size they are using, but we won’t know where they are. We should be prepared for them to switch back and forth between devices frequently, and know that they expect seamless transitions and a consistent experience across all channels. And people don’t distinguish between disciplines – the difference between UX designer and visual designer may be important to us in the industry, but in the end we are all contributing to the user experience.

3. Increasing Importance of UX in Product and Service Development

The growing importance of user experience design was evident in industries like health care. We believe that this area will continue to boom in 2013 and beyond.

More practitioners and UX-focused agencies are starting to specialize in healthcare research and design. Companies in the health vertical are increasingly seeking out specialized UX help. Rising stars like Patients Like Me, Massive Health and ZocDoc are revolutionizing myriad aspects of the health care experience. Each of these organizations is strongly rooted in UX culture. ZocDoc even won an Interaction Award for 2013. Conferences like Healthcare Experience Design continue to flourish and books are beginning to appear. Look for Peter Jones’ Design for Care, which should appear sometime in 2013.

4. Agile UX Design and the Lean Startup Movement

The evolving relationship between UX design and the Agile and Lean Startup movements was another important trend in 2012. UX practitioners continued to grapple with their role in Agile development processes.

Early this year, Jared Spool wrote:

“Agile development is no longer a fad–it’s the way people are getting software delivered… Our old methods no longer suffice, as they are too bulky and slow for the demands of the Agile process. Instead we need to come at our work with a renewed introspection of everything we do.”

UX practitioners have heeded the call: the NYC Agile Experience Design meetup group has more than 1,500 members, with nearly half of them joining in 2012. San Francisco’s Lean UX group has about 1,600 members.

A related movement, which is similarly UX friendly, is the Lean Startup “school” that recommends a “build, measure, learn” feedback loop. The Lean Startup principles boldly declare that a Lean Startup should develop a minimum viable product (MVP) to “begin the process of learning as quickly as possible.”

Designing for MVP is a radical departure from the perfectionism we have seen in traditional UX design. Doing UX in Agile fashion means backing off on documentation and traditional processes to become more iterative and nimble. It means having the courage to let something go when the design is “good enough,” knowing it can be adjusted in future iterations.

Many UX designers have seen positive results when integrating UX design into Agile development. The focus on results over processes and documentation is invigorating. Adjustments include: writing user stories; becoming less sentimental about deliverables; attending standups; and participating in testing, implementation, design and strategy at the same time. While the relationship between Agile and UX is uneasy at times, the consensus in 2012 was that the combination is a win for users and product development organizations.

5. Visualizing Data, Big and Small

Although the collection and interpretation of data isn’t a new trend, 2012 brought a new creativity and aptitude for mining data sets of varying sizes to expose patterns and develop insights – and a variety of compelling examples of how the meaning of data patterns can be visualized. Most notably, the strategic and predictive value of data analysis was a potent theme in the runup to, and aftermath of, the 2012 presidential election. Obama relied on “data-driven decision making” in fundraising and campaign strategy planning and Nate Silver famously used aggregated polling data to predict the election outcome with striking detail and accuracy.

In parallel with this new understanding of what we can learn from data, figuring out how to present it in creative and understandable ways has become a high priority. The New York Times, in particular, published a host of compelling data visualizations and information graphics around the election that illuminated election issues with greater clarity.

Mark Newman of the University of Michigan provides a great example from the election cycle. The standard red-state / blue-state map, the “traditional” visualization of election data, depicts a deeply divided country and stark correlations between geography and political persuasion.

The standard state-level red/blue election map.

However, when the map is adjusted to account for population sizes, a finer grain (counties rather than states), and a gradient to depict party lines, the story becomes more nuanced.

A county-level cartogram, sized by population and shaded along a red-purple-blue gradient.
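
The gradient itself is simple arithmetic. A minimal sketch in TypeScript (the sample data is invented for illustration) that blends two-party vote shares into a color, so a near-even county renders purple instead of flipping to solid red or blue:

    // Blend county vote shares into a red-purple-blue color instead of
    // the winner-take-all red/blue of the standard map.
    interface CountyResult {
      name: string;
      demShare: number; // Democratic share of the two-party vote, 0..1
    }

    function shareToColor(demShare: number): string {
      // Linear blend: 0.0 -> pure red, 0.5 -> purple, 1.0 -> pure blue.
      const blue = Math.round(255 * demShare);
      const red = Math.round(255 * (1 - demShare));
      return `rgb(${red}, 0, ${blue})`;
    }

    const counties: CountyResult[] = [
      { name: "Example County A", demShare: 0.48 }, // renders near-purple
      { name: "Example County B", demShare: 0.73 }, // renders blue-leaning
    ];

    for (const county of counties) {
      console.log(`${county.name}: ${shareToColor(county.demShare)}`);
    }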

Like most buzzwords, “Big Data” is casually invoked in disparate situations, and the difference between what constitutes big and small data is fuzzy and probably not that relevant. But what is meaningful from a UX design perspective about this discussion is the correlation of data patterns with underlying human behavior – and the strategic value of the data in predicting future behavior. In 2013, expect to see a greater emphasis on the collection and analysis of data, big and small, as a way to improve user experience.

Conclusion

Now that the UX “seat at the table” has opened up, we’ll need to be prepared to discuss the developments and business impact of our discipline at a strategic level. Some major themes emerged in 2012, and we look forward to following and participating in the development of these themes into 2013 and beyond.

A Perfection of Means and a Confusion of Aims

Written by: Abby Covert

“A perfection of means, and confusion of aims, seems to be our main problem” – Albert Einstein

My work involves helping people to understand how to best plan circumstances in which users are engaged and satisfied with their experience. Yet, I do not call myself a user experience designer.

I am an information architect. I work on clarifying information and the structure it should take to best enable understanding. I create maps, controlled vocabularies, diagrams, flows, hierarchies, and statements of truth to facilitate groups towards a goal. I do my own research. I use interviewing, contextual inquiry, and usability testing most often.

  • I am not an interaction designer. I do not explore, define, and refine the interactions that a user has with an interface and/or service.
  • I do not code, or render what a user will actually “see” through visual design.
  • I am not a content strategist. I do not extend structures and derive templates in order to propose governance and process flow for the creation of new information and the retiring of old.
  • I have done all of the above at one time or another.

I don’t think it is worth arguing about whether you can or should have a job in which you do all of these things. But I fear that the widespread adoption of “User Experience” has had adverse effects on the clarity of our process. It has made concepts like information architecture, content strategy and interaction design harder to explain, to teach and ultimately to learn about. In my humble opinion this umbrella is obscuring others’ view of our reality.

Our hindsight is clear, but our foresight is clouded

I am afraid that there is a shortage of specialist jobs, and it isn’t because those specialities aren’t needed. I believe it is because the value of those specialities, and the impact of not considering each carefully, is in too many cases not clearly called out to our clients and partners.

A simple test of this is asking, “If a UX fails, which part is to blame?” Is it a problem with the information architecture? The interaction design? Or maybe the content strategy? Maybe it is indeed all three or none of those three. Maybe it was badly produced or written? Maybe it has technical issues? Maybe the branding is off or the marketing didn’t drive the right people to do the right thing?

In picking apart an experience, the differentiation of terms suddenly offers the tremendous value of focus. In focusing on a specialty we don’t need to throw the baby out with the bathwater. We suddenly have lots of dials to play with in formulating a strategy for improvement.

Our process is being reduced

In my experience, when “UX” is the term sold in, the resulting project plans are less likely to reflect the points at which various specialities will be relied upon to progress the team; they often prescribe a stacked-to-the-gills list of tasks reduced to the nebulous “Design the User Experience” on the Gantt chart. The makers of these plans leave it to “UX Designers” to divide the time they have amongst the various specialities of a “UX” and arrange their time against it.

If you have a great generalist who is also a great salesperson, this model can work well. But more often I fear that we are putting our industry in a bad position by generalizing when communicating about these specialities with others. I hear designers say “I’m doing the UX” far too often when describing the value that they bring or the part of the process they are in.

The worst case scenarios result in teams jumping right to wireframes, prototypes and documentation. I see far too many UX designers that have become wireframe machines.

This approach is directly contrary to the truth of how things get made properly:

1. You must define the why before the what.
2. You must define the what before the how.

In other words, defining a solution before you understand the goal and prioritized requirements is often a wasted effort and a distraction. Whether you define everything on your own and work through the various specialities required is really not my point at all; my point is that these questions and these specialities are always needed and in some cases they are answered by different people.

Our specialists are struggling

I work primarily on large-scale, systems-based projects. I am good at defining the Why and the What. But when it comes to defining the How, I prefer to work with others more dedicated to that craft. Sometimes the user experience designers I work with struggle to understand and champion the value of information architecture. Sometimes they feel like I am taking away their fun. But in moments when they need the information architecture to be clearer, they are able to demarcate it clearly and ask me for what they need. After a few of these moments the process gets easier for all of us and consensus is more easily reached.

Why would I ever have to defend the value of IA on a team of like minded umbrella dwellers? Why do I see important steps skipped in favor of moving right to defining the How so often? I don’t think it is because I am working with the wrong people. I think it is because our industry has a long history of land grabbing of titles, processes, deliverables, and value.

Our road has been paved over by many, and driven over by many more

In the past year I have been told to change my title to product manager, design thinker, strategist, service designer, interaction designer and of course user experience designer.

The convergent nature of this industry has made our road a hard one to name, and I respect that deeply. But I don’t think the right strategy is giving up on naming it, stealing a name, or settling for a name that doesn’t quite fit.

I think this is about hunkering down and creating consensus. We need to define ourselves and our value. We need to learn to sell ourselves and the expertise of others under this umbrella. We need to remember what it is like to not understand these concepts. We need to create ways of explaining what we do that make sense outside of our silo. Lastly and maybe most importantly for our own sustainability as a field – we need to give permission to generalists to specialize.

These are important steps forward as we continue to become a legitimate field of practice. Our students, clients, heroes, and our peers deserve these levels of truth.

Our future is bright

Regardless of what you call what you do, it is a great time to be alive and contributing to this industry. My greatest hope is that this is still fun to talk about when it is all said and done.

I am an information architect. One day I may be something else. For now, I see a need, and I want to keep on filling it.

The Music Outlives the Band

Written by: Robert Hoekman Jr

Parental advisory for strong language, guru deflating and semantics.   

A couple of years ago, I was asked to speak about “design thinking” at a web conference. The conference-speaking part was nothing new, but the topic certainly was. With the “design thinking” wave having just recently peaked, I had yet to even come up with a clear definition of the term. So I accepted the challenge and went about the business of putting a wrapper around the idea so I could map it to our work as strategists and designers.

What I found was a bit of a joke. The few snake-like definitions I was able to charm out of the depths of the interweb with magical flute-playing were no better, no worse, and no different than definitions of “interaction design,” which were in turn no different than definitions of “problem solving,” which we as a species have been doing since the dawn of humanity. So what was the big deal about design thinking? Well, the big deal was that some designer douchebag decided one day to rebrand “user experience,” presumably to bring his agency a few new dollars. Leader of IDEO or not (I’m talking to you, Tim), rebranding a profession for no good reason is not a noble nod to semantic precision, but an exercise in self-importance.

Besides this, it bothered me that those among us who spend our time fighting the good design fights were acting like we had nothing better to do, as if the world would solve its own problems while we were over in the corner deciding what to call our particular brand of ice cream. And it was at this point that I decided I no longer gave a fuck.

It doesn’t matter one bit what you name the band; it matters how good the music is. The reputation you build around the name will outlive even the stupidest, most drunken attempts at a moniker cool enough to guarantee future rock god glory.

And most importantly, while we were all busy debating syllables and word pairs, the world at large caught onto the moniker we’ve been using all along. On a near-constant basis these days, the term “user experience” is used by people whose expertise is in raising kids, or selling insurance, or milling sugar. It’s used by people who have no business even knowing what “user experience” means. It appears in dinner table conversations. It appears in write-ups about apps, devices, and gadgets galore. It’s in magazines, on television, and online.

“User experience,” as a term, is weak, ineffective, and inaccurate. But although I am among the many in our profession who believe this sad title we’ve assigned ourselves becomes less potent with each utterance, I happen to also believe we should guard it with our proverbial lives. “User experience,” like it or not, has become a household name. And the best chance any of us has at legitimizing a profession that invariably begs further explanation and qualification is to make it as easily recognized and banal as “carpenter” and “motorcycle mechanic.”

“User experience” either is or isn’t the best term to serve as the concrete beneath our careers. And we either give a fuck or we don’t. It’s our choice.

Let’s stop talking about what it’s called and start solidifying the world’s understanding of it. Some people build cabinets. Some fix motorcycles. We design sites and apps. It doesn’t matter how we do it. It matters how easy it is to accept that it’s real, that it matters, and that it’s a sound career path to describe when you meet your girlfriend’s parents over Thanksgiving dinner.

Fuck title debates. “User experience” has momentum. Let it roll, and get back to work.

What is User Experience?

Written by: Stephen P Anderson

I’m tired of discussing “user experience.” What it is, what it isn’t. I’m tired of talking about wireframes vs prototypes. I’m tired of the agile-lean-waterfall debates. I’m weary from discussing personas and sitemaps. I’m wary of design patterns, and I’m pretty sure the term “user” has held us back. Above all, I’m tired of defining the damn thing. None of this is what I practice, and these things aren’t user experience.

So what is UX?

Let’s talk instead about two kinds of people:

At the end of the day, there are those people who will go quietly about their jobs, perhaps grumbling about not having a “seat at the table.” These people may have also been taught the right way to do things. Then, there are others who–regardless of their titles or position–will stand up and say, “Wait a minute, why are we doing it this way?”

What sets these folks apart is a relentless curiosity.

They are the people who ask all the “what if?” and “why not?” questions. They disrupt processes when the process isn’t paying off. And they defy decisions when the decisions don’t make sense. They may be subversive, but their goal isn’t subversion. Rather, they care. About the experience being designed for, and the people who will have to live with these experiences.

They care because of another vital quality: Empathy. They care about people. It’s not all about the paycheck for them. They care about what is created, because people will be affected, influenced, hindered or helped. They care about the business, people served by that business, and ultimately the world. They are curious, have empathy, and are vocal, which puts them in an interesting position.

Regardless of their appointed position, those who care will be found at the center of all that is designed, built, served, or otherwise experienced by people. These people will care about load times in browsers as well as long lines at the grocery store. They care about the details that make or break an experience: remembering a name when someone on hold is transferred to another representative, or fixing the out-of-place pixel because, well, it matters.

They care about solving the real problems. And to be clear, they care about the business that enables these experiences. No customer = no business (and vice versa). Because they care about all of these details, they don’t fit neatly into an existing business unit. And they don’t even want their own business unit. They’d rather cut across silos, and grin at the magic that happens when people collaborate across disciplines.

These people have been around, long before the term “UX.” And these people will still be around, long after UX has either died out or hardened into a certifiable profession.

I am one of these people. I design experiences. Or I design for experiences, if we must mince words. I don’t do this because I was trained to do so. I do this because I must. I am a User Experience Designer. Or whatever they’re calling it these days.