Leaving the Autoroute

Written by: Christina Wodtke

I recently had the pleasure of traveling across France via autoroute. In the past, my husband and I had taken all backroads for our adventures, but on this trip we needed to get from one in-law to the next in a day, and the autoroute was the ticket. The vast expanses of French countryside are gorgeous and remarkably varied—rolling hills and grassy fields become bluffs and cliffs; vineyards become cornfields, then sunflower fields; all punctuated by signs proclaiming the next town. The signs caught my eye. Unlike America, where a sign carries just the town name, here each name was accompanied by an illustration of the things for which the town was famous: one town is famous for mustard, one for knives, one for nougat, one for a type of melon… The first time I saw this I laughed. The idea of a town devoting itself to nougat seemed a bit absurd. But specialization has power. The nougat of Montelimar can be found all over France and is known to be the best. Laguiole is recognized for making fine knives not only in France, but around the world. Everyone knows the mustard from the city of Dijon. By committing all their attention to a single craft, often literally over hundreds of years, each of these towns has earned the renown that comes with great work.

But what happens when you leave the autoroute, lured by one of those signs proclaiming the town’s mastery and claim to fame? You find a town—a butcher, a baker, a pastry shop, a pharmacy. Little gray-haired ladies with their baskets heading for the shops, men sitting in the café with a glass of Pastis or playing Petanque in the park. Mothers shopping, pushing baby carriages, tourists eating in overpriced cafés with English menus, a church still frequented by worshippers as well as chubby tourists… in other words, each town has all the things a town must have to be a town. Laguiole has its share of knife shops, but overall it is still a town and supports the inhabitants who give it life. The knife-maker has a place to eat, work and worship, as well as friends to meet for a drink and a game of Petanque. Moreover, as he watches the butcher cut a steak from a side of beef or a pastry chef slice apart a cake, he learns more about what a knife should be.

So, other than a chance to reminisce (ah, the oysters of Gujan-Mestras, the macaroons of St. Emilion, the cannelés of Bordeaux), what does this mean for us, practitioners of the young and unrefined craft of designing digital systems? What the heck are you raving about, Wodtke? Simply that the passionate debates over specialization vs. generalization are a false dichotomy, and they are not serving us. It’s not “versus” but “and” that we should be using. Designers who know nothing of HTML or image optimization, usability experts preaching without even a basic knowledge of design principles, information architects and interaction designers who don’t understand each other’s skills are weakening themselves, just as Laguiole would if it closed its pharmacy for another knife shop. The health of your craft comes from a rich, broad base of knowledge.

Recently a well-known usability expert discovered a clue to improving his own site on a web design list. The tip was one of the most basic pieces of design knowledge you learn when you begin to study design. Yet this specialist didn’t know it—and moreover, the usability of his product suffered because he was not well rounded. Usability sites are notorious for the crudeness of their design, design sites for their lack of usability. Sites by engineers often miss on both counts, while sites built without an engineer’s knowledge load slowly and are buggy. It’s not enough to be a specialist—as they say, when all you have is a hammer, everything looks like a nail. You have to have a broad grounding in the related fields along with a deep understanding of your area of specialization. IBM calls these folks T-shaped people, and seeks them out when hiring time rolls around.

Moreover, the world beyond our craft teaches us our craft. Poetry informed my ability to be an information architect—from it you learn the subtle nuances each word carries, and how to craft phrases that ensnare your readers’ emotions. This knowledge informs labeling choices, of course, but also the more delicate arts of contextual messaging and categorization. Cooking and collecting cookbooks impart a great deal of insight into what makes instructions succeed or fail; travel has taught me to question my most basic assumptions about user behavior.

I have also cracked a few O’Reilly books and learned basic coding; I have spent time in usability labs learning from users and from the researchers who can interpret what their behavior means; I have spent time at designers’ elbows asking them to explain color, line and form; I have read business tracts—all of which has had a direct and immense effect on my skill at Information Architecture and Interaction Design. I don’t consider myself a master craftsman, but I know that if I wish to become one, it means attending not just to my specific skillset, but to the world in which it resides.

You can’t be an expert in everything, obviously. But you can make sure you have enough knowledge to appreciate the craftspeople you work with. So designers, take “Introduction to Programming” at the local college. Engineers, attend all the usability sessions and watch what those crazy users do. Usability folks, go read Robin Williams’ “The Non-Designer’s Design Book” at least.

If you dream of being an expert, read the Sunday paper cover to cover, from the business section to the comics page, and then read a peer-reviewed journal. Take a painting class, study yoga, cook a complicated meal. Learn from your coworkers, and learn from your friends. Specialize, but remember to be a human being as well. And someday you may be as famous as the mustard of Dijon.

Lessons to be Learned

Written by: Marla Olsen

Ivy-covered halls are filling up again with eager students of the user experience fields, ready to change the world (or at least to wait out the recession). But are these programs really teaching them what they need to know?

There are serious problems with the way user experience-related programs are being taught. Don’t get me wrong, I’m not against academic degrees. My father was a professor and I’ve been an instructor myself. But that experience makes me worry that current academic programs aren’t well suited to serving the needs of their students, or of our professions. Let me count the ways…

Research vs. practice—To be fair, academia, especially at the graduate level, has a two-headed mission: to train future professionals and to advance the discipline. Unfortunately, academic culture is heavily weighted toward research at the expense of teaching. No one gets tenure for being a great teacher; getting tenure means publishing or perishing.

The problem is that no one seems to pay attention to whether the published articles are meaningful. Dr. Bob Bailey of Human Factors International estimates that only 5 percent of the roughly 1,000 usability-related articles published each year have any practical, long-term value to working professionals. These are the professors who are teaching our future colleagues? And this is only one arm of the user experience collection of disciplines.

Granted, Bailey has an ax to grind, because he wants you to sign up for HFI’s seminar where they present their “best of” summary of the latest research. But having read a decade’s worth of SIGCHI papers and a couple of years of ASIS&T journals, I can tell you that the amount of useful research is far too small. (And unfortunately, the good research that exists is hidden in academic jargon, a less than user-friendly format for practitioners—particularly ironic given our field.) Much of it has been simply irrelevant. Some of it has been laughably bad, where it was obvious the researchers were venturing into territory where they hadn’t a clue—and hadn’t bothered to involve someone from that field who could’ve prevented them from making basic mistakes. Which brings us to the next problem…

Specialties vs. convergence—Too often academia fetishizes specialization. This is compounded by the departmental turf wars that seem as much a part of colleges as the ivy covering the halls. The Computer Science department doesn’t talk to the Design department, which doesn’t talk to the Library and Information Sciences department…

The problem here is that user experience is a new, convergent field, even though its individual roots may run deep. It requires skill in a variety of disciplines to integrate content, presentation and functionality. In the past these were typically separate concerns—for example, no one thought about the information architecture of a software product, or the branding implications of a categorization scheme—but first interactive multimedia and then the web caused these once-separate concerns to overlap and blur. Even if you choose to specialize as an information architect, an interaction designer, or a usability engineer, it’s essential that you understand the wider context if you want to be effective.

Universities have developed innovative cross-disciplinary programs in other fields. For example, the University of Southern California recently overhauled its journalism program to require a “core curriculum” of reporting, writing, and producing for the three primary media formats—print, broadcast, and online—before students can specialize in one area, rather than following the traditional format in which each medium is an independent track. Students aren’t realistically expected to excel in all three media, but the program is intended to make them comfortable when asked to do something outside their normal specialty.

There is some hope in our fields. Last year the University of Baltimore created a Master’s Degree in information architecture and interaction design. Even better, the program was designed to let students take their elective courses in four focus areas: technical, arts and culture, cognitive and ethnographic, and management and entrepreneurship. Likewise, for several years AIGA has been trying to develop an appropriate, broad-based curriculum for “experience design.” Both of these are the sort of forward thinking we need. The problem is…

Time vs. breadth—Back in March, Jess McMullin asked for help in compiling a list of what professionals in the field should know, so he could talk to a local college about creating a user experience certificate program. As you can imagine, it was a long, long list.

The problem, of course, is that a decade-long degree program just isn’t realistic, even though that’s what it would probably take to combine classes from all the relevant fields and cover them in depth. Most programs are only two years; some, like Carnegie Mellon’s Master’s Degree in HCI, are only a year. Carnegie Mellon is highly regarded, but how much can you really teach in a year? Other programs may be longer, but often there’s even less time devoted to relevant topics, since information architecture or interaction design is only a recent (and smaller) add-on to a larger traditional program.

There’s no good solution, so students just need to be mindful about how much they’re really learning—and not learning—in the amount of time they’ve got. Focus can be a substitute for time, but many programs aren’t focused well enough, especially in a rapidly changing field, because few experienced professionals have a hand in their creation. Why, you ask?

Degrees vs. experience—Without an advanced degree—a doctorate definitely preferred—it’s nearly impossible to become a tenured faculty member. Lecturers are at the bottom of the departmental hierarchy and consequently aren’t involved in setting the direction.

The problem is that those who’ve spent years in the trenches—nurturing these disciplines, building websites and software, making mistakes along the way and learning from them—just aren’t likely to go back for an advanced degree in something they can probably teach, and arguably teach better than someone whose only experience has been theory and research. But in academia—no degree, no tenure.

To their credit, a number of professors I’ve talked with recognize this problem. Unfortunately, it’s unlikely to be resolved without an overhaul in the way academia works so that masters of the crafts are valued as much as a master’s (or doctoral) degree.

Education vs. experience—This lack of real-world experience also has other side effects. One professor complained to me about how a number of his usability students come to him expecting to be “saviors,” protecting users from ambiguity and other horrors.

Such, ah, enthusiasm is the nature of students, and school should provide a hothouse for them to explore ideas. But ultimately it’s healthier for students to be exposed to the cold winds of reality, in measured doses, before they graduate. The instructor of my final design class began the course by saying, “For the next 12 weeks, I’m going to act like a client and you’re not going to like it.” She was right. But I learned some of my most valuable lessons then, and learned them in an environment where the worst consequence was a bad grade, not getting fired.

However, being able to impart those lessons requires experience outside the ivory tower. A professor who’s never had to work effectively in a team, who’s never had to balance competing demands, who’s never had to make the hard trade-offs to keep a project on schedule and within budget just isn’t equipped to convince students that life is a bit more complicated than theory. While I disagree with Jakob Nielsen on a number of things, I do agree with him that it often takes a decade’s experience in the field to really master a discipline. The question is how to get those who do to also be those who teach.

Business schools offer a potential model for how academics and professionals can work together. They’re far from perfect, but from what I’ve seen they strike a much better balance between the competing demands of the research world and the professional world. The papers may still be in academic form, but I’ve seen a far higher percentage of them address real-world concerns. The professors still teach, but working professionals are frequently invited as guest speakers to complement theory with practice. In an even more radical departure, USC’s journalism program now intentionally relies primarily on adjunct faculty, who are working professionals, to do most of its teaching.

But unfortunately, the wheels of academic bureaucracy move slowly, so don’t expect these sorts of changes to spread through academia anytime soon. In the meantime, though, there are things we can each do:

  • If you’re a student, insist on getting practical, hands-on experience in addition to classroom lessons. With the current job market, internships may be tough to get, but there are almost always departments on campus that could use help. And remember, while you’re getting an education, you’ll still be lacking experience when you graduate. So it’s wise to show a little humility. Recent grads who claim to be the “expert” only undercut their own credibility.
  • If you’re an educator, reach out to the professional community: invite practitioners as guest speakers, and talk with them about what kinds of research might be useful to them. Likewise, ask professionals what skills are really necessary, and make sure your students get out of the lab and into the field. Encourage students to think about the bigger picture beyond their particular specialization.
  • If you’re a working professional, take some time to start talking with academia, to ensure students get the sort of education that’s going to be useful to them after they graduate. If you can, bring on interns. Volunteer to guest lecture. You’ll probably learn something yourself from dealing with students who don’t have preconceptions and/or are interested in pushing the boundaries of new ideas.

After all, we can only benefit by having practitioners who are better prepared to meet today’s challenges.

The Tool Makes the (Wo)Man

Written by: Erin Malone

The other day at work, we were planning some new processes for bringing work into the team. One team member suggested we use a product that another group was using to track our projects. The suggestion on the table essentially meant we would force-fit our way of working into this tool “because we already had the tool.” This was proposed instead of doing the work to figure out how we needed to get our jobs done and then doing the due diligence to find the tool that best matched our needs.

This scene resonated with me because it is an example of not understanding the problem at hand. Jakob Nielsen’s exclamation that “Flash is 99% Bad” is another example of barking up the wrong tree. He is now working with Macromedia to make the tool more usable—as if the tool were the source of the problem. What I can’t understand is why more people aren’t getting riled up about the fact that the problem isn’t the tool.

The SIGIA list occasionally erupts into the “Which tool do you use?” or “Which tool is best for information architecture, for flow mapping, for wireframing?” conversations. Even Steve Krug noted this at the IA Summit in his Top Ten list of what IAs talk about. These questions arise as if the perfect tool would make the perfect IA. In these discussions, we lose sight of the fact that we already have the perfect tool: our brains. The knowledge, expertise and skills to solve problems are right between our ears.

The visual manifestation of a solution—whether done in Illustrator, Omnigraffle, Visio, HTML, Flash or even on a cocktail napkin—is beside the point. If the solution is appropriate to the problem and the end user, then it doesn’t really matter how it is implemented.

But, you say, “the best, the right, the perfect tools will help us.”

“They will make us more efficient and give us more time to think, to solve problems.”

And I would say, you are right… to a degree.

Solving the problem will come from a deep understanding of the issues, of the users’ needs, of the task—from research, from analytical thinking and then sketching out solutions. Sketching these solutions can be done in any way—on a whiteboard, on paper with (gasp) a pen or pencil, or on the computer with the tool of choice.

My concern and angst over these types of discussions, as well as over the kinds of proclamations that Nielsen and other gurus make, is that focusing on the tool—either finding the right tool or badmouthing the perceived “wrong” tool—moves our energies away from the real problem at hand: design solutions that are inappropriately or poorly executed.

In all the talk of Flash being bad, I have never seen Nielsen or others offer to work with design schools, or to help craft curricula, lessons or workshops that would teach the appropriate skills to a generation of designers who are being taught tool after tool rather than how to solve problems.

So what’s my point? The tools of the trade that we use to solve our problems are mostly irrelevant. They come down to personal preference, to comfort level, to speed of learning and to what others in the group are using (generally a concern when sharing documents). The tool we should be cultivating is our brain—our skill for problem solving and providing value to our clients and companies.

The tools used to implement solutions (as opposed to the tools used to design solutions) also matter a little less than we’d like to think. Of course, the solutions need to be appropriate to the medium and to the end users’ needs, and they should solve the problem in the best way possible.

So even if Nielsen and Macromedia succeed in making rich media best practices 100 percent “good” (Macromedia press release, June 3, 2002), or even if someone comes along with the killer app for IA work, it still won’t matter much if designers and IAs don’t understand the medium or how best to solve the problem.

We have a responsibility to kick things back—to our bosses, to our clients, to our colleagues—when the recommendation to use a certain tool or technology just because it is there doesn’t fit the needs of the task, whether that task is designing a solution or implementing a solution. We have a responsibility to be smart problem-solvers and use the one tool that we all have—our brains.

(Over)simple Answers for Simple Minds

Written by: Marla Olsen

Hell hath no fury like those who’ve been attacking Jakob Nielsen on various user experience-related mailing lists in recent weeks over his decision to work with Macromedia on Flash-related usability issues, after nearly two years of declaring “Flash 99 percent bad.” He’s being called a sellout, a hypocrite. The list goes on.

Part of me feels for Nielsen. After all, the pressure to provide simple answers to complex issues is a temptation we all face in our professional practices.

Let’s face it: if Nielsen had said “Flash is 50 percent bad,” he wouldn’t be getting beaten up as much as he is—but no one would’ve paid as much attention to his original article either. I’ve spoken at a number of conferences, and the more provocative you are, the more audiences tend to listen. Plus, being a pundit can create a subtle feedback loop that leads you to think your way is the best way for everyone—being known for a particular approach attracts clients who agree with your view and drives away others whose problems don’t fit your approach.

But even if you’re not a high-profile guru, you’re probably facing similar pressures to come up with quick and easy answers to Big Gnarly Problems™.

If you’re a consultant, you have to position yourself as “the one with the answers” if you want to be successful. After all, clients come to you because they’ve got a problem they can’t solve themselves, and they want someone who can. As a wise old consultant once pointed out, from the client’s point of view, they’re generally not paying you by the solution, they’re paying you by the hour. Somebody who says “it depends” is liable to eat up the budget without providing answers—at least that’s the fear.

If you’re an in-house professional, there’s a different but similar dynamic. To get that bonus, you want to show that you are knowledgeable, that you can get things done, that you’re the expert. This is often reinforced by a management culture that places a premium on decisive (if not always thoughtful) action. Compounding the issue, many of our positions are relatively new at companies, so there’s a need to prove ourselves—or at least that’s our perception.

Another factor is the relative inexperience of many practitioners. Back in January 2001, a survey by the Argus Center for Information Architecture found that two-thirds of respondents had less than two years’ experience. (Unfortunately, the survey didn’t ask about prior experience in other fields.) More than a year later, the majority of these people are still apprentices—at best journeymen—since it often takes a decade to truly master a field.

Even if you’ve got a background in another field, you may still be trying to master new areas. While the roots of the issues we deal with are familiar from various fields, the web is different. It mixes content, behavior and presentation in new and deeper ways. Library scientists never worried much about the branding of the things they organized. Software and usability engineers rarely dealt with navigating large amounts of content. Graphic design traditionally offered little guidance for moving beyond communication to the functional aspects of interaction.

Consequently, even if you are experienced, this convergence often poses new types of problems, leaving you swimming in uncharted waters where it’s comforting to cling tightly to the lifebuoy of an expert’s pronouncements. They’ve done the thinking for you; all you need to do is repeat it. Unfortunately, even when the expert’s thinking does deal with complex issues in a sophisticated way, the disciples rarely match the nuances of the master. Which makes it all the more unfortunate that too many gurus today are promoting oversimplified ideas to begin with.

Another problem is that simple-minded answers are often popular with clients and bosses looking for easy solutions. Witness the cycle of business fads among managers and management consultants. As you move up the corporate ladder, attention spans seem to get shorter and shorter, forcing you to talk about complex and difficult issues in bullet points.

All of these conspire to lure us into making simple-minded and absolutist pronouncements of The Right Way To Do Things. Particularly since our role in an organization is often new—either being brought into a company to consult, or holding a relatively new title—we often need to make ourselves heard. Consequently, much like pundits at conferences, we resort to provocative oversimplifications to grab people’s attention.

While that may be effective in the short term, it’s self-destructive in the long term, both for individuals and for our professions as a whole. Witness the widespread cynicism toward management consultants as each new One True Way tosses aside the last. Do we really want to end up in a similar position?

Statements like “the web will reach 90 percent compliance with a particular set of usability guidelines by 2017” are so ill-conceived that they’re bound to blow up in our faces when clients and colleagues take a closer look. Absolutist statements that the web is—and damn well should be—“just a library” or “just an application” or “just cool design” reveal a narrow-mindedness that entirely misses the multi-dimensional nature of the convergent field we’re working in. Again, when those we deal with realize this, who’s going to look like the expert?

So what is the alternative? For a while it was fashionable to declare “it depends.” And yes, it does. But there are a couple of problems with using this as a universal answer.

First, we’re hired—by clients or by bosses—to solve problems. That means we should be able to come up with solutions, or at least recommendations. How would you feel if, whenever you asked your doctor about your health, your doctor replied, “it depends”? Would you start questioning your doctor’s competence? I thought so.

Second, “it depends” is reflective of a fear of design and a fear of taking responsibility. It also reflects a lack of knowledge of the guiding principles from various fields that can be used in dealing with our Big Gnarly Problems™. These aren’t “rules,” but rather “rules of thumb.” They don’t provide simple answers, and they often involve subtle interrelationships among the different principles at work. Sometimes they can be contradictory—just as Newtonian and Einsteinian physics contradict each other. That’s because, much as in physics, different rules of thumb are best applied in particular situations.

A big part of being the expert means applying your best judgment to complex and uncertain problems, even in the face of incomplete information. It’s hard, but that’s what we get paid for. If creating good user experiences were easy, everyone would be doing it already.

There are no easy answers. But to violate everything I’ve just said, let me suggest there’s at least one easy step: start by saying, “It can depend, but, in this context, here’s what I recommend…”

Fear of Design

Written by: Christina Wodtke

Not so long ago, on my personal site I posted a little entry on design. And a comment was made: “IA is not design.” This sentence has sat vibrating in my head for months. It speaks of bravado in the face of fear. But why should Information Architects fear design?

Information Architecture is design. We are afraid to admit it, but IA is surely design, as much as Interaction Design is design, Architecture is design, and Engineering is design. In each of these activities we create.

The nature of design is to make, with its accompanying activities of refining, organizing and surfacing. We look at the world, we think, we call upon our trained gut and we make something. We then refine that little germ of a design through skills acquired over time, we organize the designs into a consistent whole, and we create a surface to make that whole palatable to the consumer. If you think IA has nothing to do with surfaces, think of labels or navigation structures. We may not always choose the color, but we are deeply concerned with surfaces because they are the final manifestation of our design.

Usability is criticism. It looks at the designer’s creations and says, “I have evaluated it on X, Y and Z, and found it wanting in A, B and C.” Then the usability specialists are free to leave the room. They’ve done their piece; they can now sit back and wait for the next creation. It’s valuable, it informs and improves our work, and it’s safe—emotionally—for the practitioner.

User research informs design. You learn how people work, how they dream, their desires and fears and habits. A user researcher observes people’s behavior and then writes up a nice report: User X likes this, User Y tends to do that. But someone has to make the leap from this information to an actual creation. Someone has to be ballsy enough to say, “User Y tends to do that, so the button goes HERE.” It’s the same with business analysis or requirements gathering: at some point you have to leave the safe haven of information gathering for the uncertain ground of design. At some point you have to screw up your courage and make something.

Why are we afraid of design? Because if we are designers, we will have to be responsible for our designs. Researchers and critics can shrug and say, “Well, those are the facts.” But designers must stand tall and say, “That was the solution I came up with.” The designer and the design are not so easily separated. It takes an iron grip on one’s ego to take criticism of one’s designs, whether it’s a thesaurus or the front page of a website. Crafting a design is an attentive and loving act. It makes one vulnerable, and I suspect some IAs think that by donning Jesse James Garrett’s lab coats, they can trick themselves into separating from the design and gaining emotional distance.

“I have studied this problem at great length and the solution is indicated by the data.”

My design is perfect.

Bullhockey.

The web is too new—heck, software design is too new—for us to say there is a clear and easy answer when we design. Every time we make something, we are leaping out of an airplane and all the research in the world is just us packing our parachute carefully. The landing will still be felt.

Graphic designers have fought this vertigo for years. They’ve learned to articulate a defense of their designs in presentations, to explain their rationale in hopes of slowing the free-fall, and they even have protective gear for when they jump (lately seen outside a Flash conference: a gaggle of designers all in horn-rim glasses and Italian shoes).

But they know, and I know, that bad landings happen. Designers get pulled off projects and their egos are bruised. Hurt is exactly how they should feel: if their egos weren’t bruised, they weren’t trying hard enough. Professionalism means they don’t show it, but if they are good designers, they care. And caring means feeling pain sometimes.

So are we, designers of digital experiences and architects of information, ready to take on that potential pain in order to make good work? Are we ready to take in information, but not hide behind it? Will we be responsible for our creations? Will we put our egos in the plane?

Do we have the courage to design?