The Tool Makes the (Wo)Man

The other day at work, we were planning some new processes for bringing work into the team. One team member suggested we use a product that another group was using to track our projects. The suggestion on the table essentially meant we would force-fit our way of working into this tool “because we already had the tool,” instead of doing the work to figure out how we needed to get our jobs done and then doing the due diligence to find the tool that best matched our needs.

This scene resonated with me because it is an example of not understanding the problem at hand. Jakob Nielsen’s proclamation that Flash is “99% bad” is another example of barking up the wrong tree. He is now working with Macromedia to make the tool more usable—as if the tool were the source of the problem. What I can’t understand is why more people aren’t getting riled up about the fact that the problem isn’t the tool.

The SIGIA list occasionally erupts into the “Which tool do you use?” or “Which tool is best for information architecture/best for flow mapping/best for wireframing” conversations. Even Steve Krug noted this at the IA Summit in his Top Ten list of what IAs talk about. These questions arise as if the perfect tool would make the perfect IA. We lose sight in these discussions of the fact that we already have the perfect tool: our brains. The knowledge, expertise and skills to solve problems are right between our ears.

The visual manifestation of a solution—whether done in Illustrator, Omnigraffle, Visio, HTML, Flash or even on a cocktail napkin—is beside the point. If the solution is appropriate to the problem and the end user, then it doesn’t really matter how it is implemented.

But, you say, “the best, the right, the perfect tools will help us.”

“It will make us more efficient and give us more time to think, to solve problems.”

And I would say, you are right… to a degree.

Solving the problem will come from a deep understanding of the issues, of the users’ needs, of the task—from research, from analytical thinking and then sketching out solutions. Sketching these solutions can be done in any way—on a whiteboard, on paper with (gasp) a pen or pencil, or on the computer with the tool of choice.

My concern and angst over these types of discussions, as well as the kind of proclamations that Nielsen and other gurus make, is that focusing on the tool—either finding the right tool or badmouthing the perceived “wrong” tool—moves our energies away from the real problem at hand: design solutions that are inappropriately or poorly executed.

In all the talk of Flash being bad, I have never seen Nielsen and others offer to work with design schools or to help craft curricula, lessons or workshops that will teach the appropriate skills to the generation of designers who are being taught tool after tool rather than how to appropriately solve problems.

So what’s my point? The tools of the trade that we use to solve our problems are mostly irrelevant. They come down to personal preference, to comfort level, to speed of learning and what others in the group are using, which is generally a concern when sharing documents. The tool we should be cultivating here is our brain—our skill for problem solving and providing value to our clients and companies.

The tools used to implement solutions (as opposed to the tools used to design solutions) also matter a little less than we’d like to think. Of course, the solutions need to be appropriate to the medium, to the end users’ needs and should solve the problem in the best way possible.

So even if Nielsen and Macromedia succeed in making rich media best practices 100 percent “good” (Macromedia press release, June 3, 2002), or even if someone comes along with the killer app for IA work, it still won’t matter much if designers and IAs don’t understand the medium or how best to solve the problem.

We have a responsibility to kick things back—to our bosses, to our clients, to our colleagues—when the recommendation to use a certain tool or technology just because it is there doesn’t fit the needs of the task, whether that task is designing a solution or implementing a solution. We have a responsibility to be smart problem-solvers and use the one tool that we all have—our brains.

(Over)simple Answers for Simple Minds

Hell hath no fury like those who’ve been attacking Jakob Nielsen on various user experience mailing lists in recent weeks over his decision to work with Macromedia on Flash-related usability issues after nearly two years of declaring Flash “99 percent bad.” He’s being called a sellout, a hypocrite. The list goes on.

Part of me feels for Nielsen. After all, the pressure and temptation to provide simple answers to complex issues is one we all face in our professional practices. Even if you’re not a high-profile guru, you’re probably facing similar pressures to come up with quick and easy answers to Big Gnarly Problems.

Let’s face it: if Nielsen had said “Flash is 50 percent bad,” he wouldn’t be getting beaten up as much as he is—but no one would’ve paid as much attention to his original article either. I’ve spoken at a number of conferences, and the more provocative you are, the more audiences tend to listen. Being a pundit can also create a subtle feedback loop that leads you to think your way is the best way for everyone—being known for a particular approach attracts clients who agree with your view and drives away others whose problems don’t fit your approach.

Fear of Design

Not so long ago, on my personal site I posted a little entry on design. And a comment was made: “IA is not design.” This sentence has sat vibrating in my head for months. It speaks of bravado in the face of fear. But why should Information Architects fear design?

Information Architecture is design. We are afraid to admit it, but IA is surely design as much as Interaction Design is design, Architecture is design, and Engineering is design. In each of these activities we create.

The nature of design is to make, with its accompanying activities of refining, organizing and surfacing. We look at the world, we think, we call upon our trained gut and we make something. We then refine that little germ of a design through skills acquired over time, we organize the designs into a consistent whole, and we create a surface to make that whole palatable to the consumer. If you think IA has nothing to do with surfaces, think of labels or navigation structures. We may not always choose the color, but we are deeply concerned with surfaces because they are the final manifestation of our design.

Usability is criticism. It looks at the designer’s creations and says, “I have evaluated this on X, Y and Z and found it wanting in A, B and C.” Then usability specialists are free to leave the room. They’ve done their piece; they can now sit back and wait for the next creation. It’s valuable, it informs and improves our work, and it’s safe—emotionally—for the practitioner.

User research informs design. You learn how people work, how they dream, their desires and fears and habits. A user researcher observes people’s behavior and then writes up a nice report: user X likes this, user Y tends to do that. But someone has to make a leap from this information to an actual creation. Someone has to be ballsy enough to say, “User Y tends to do that, so the button goes HERE.” It’s the same with business analysis or requirements gathering: at some point you have to leave the safe haven of information gathering for the uncertain ground of design. At some point you have to screw up your courage and make something.

Why are we afraid of design? Because if we are designers, we will have to be responsible for our designs. Researchers and critics can shrug and say, well, those are the facts. But designers must stand tall and say, “That was the solution I came up with.” The designer and the design are not so easily separated. It takes an iron grip on one’s ego to take criticism of one’s designs, whether it’s a thesaurus or the front page of a website. Crafting a design is an attentive and loving act. It makes one vulnerable, and I suspect some IAs think that by donning Jesse James Garrett’s Lab Coats, they can trick themselves into separating themselves from the design and gaining emotional distance.

“I have studied this problem at great length and the solution is indicated by the data.”

My design is perfect.

Bullhockey.

The web is too new—heck, software design is too new—for us to say there is a clear and easy answer when we design. Every time we make something, we are leaping out of an airplane and all the research in the world is just us packing our parachute carefully. The landing will still be felt.

Graphic designers have fought this vertigo for years. They’ve learned to articulate a defense of their designs in presentations, to explain their rationale in hopes of slowing the free fall, and they even have protective gear for when they jump (lately seen outside a Flash conference: a gaggle of designers all in horn-rimmed glasses and Italian shoes).

But they know, and I know, that bad landings happen. Designers get pulled off projects and their egos are bruised. Hurt is how they should feel: if their egos weren’t bruised, they weren’t trying hard enough. Professionalism means they don’t show it, but if they are good designers, they care. And caring means feeling pain sometimes.

So are we, designers of digital experiences, architects of information, ready to take on that potential pain in order to make good work? Are we ready to take in information, but not hide behind it? Will we be responsible for our creations? Will we put our egos in the plane?

Do we have the courage to design?

Arrows in Our Quiver

As I write this, the Police’s “Synchronicity” is on the radio, and that’s a good way of summing up some of the interesting developments of the past few months.

On mailing lists, at conferences, in conversations at cocktail hours, I’m starting to see a growing awareness of how our various disciplines form a community of practice.

At last month’s SIGCHI conference, I helped lead a workshop that compared traditional HCI (human-computer interaction) to the “new” information architecture by looking at our deliverables. What was striking was how similar they turned out to be. Looking at what we do, it was hard to tell whether someone should be called an information architect, an interaction designer or a usability engineer.

Obviously there were differences in focus depending on whether someone was working on a software application or a portal site. And oftentimes the same deliverable or process might have completely different names: one person’s scenario was another person’s use case. But a deliverable by any other name still serves the same purpose. Hairsplitting can be fun, and I’ve done my share of it, but let’s not lose sight of things.

Alan Cooper made the point eloquently at the previous day’s SIGCHI/AIGA Experience Design Forum (which, incidentally, was the first joint event between the two organizations). Cooper called on people to end the terminology debate and learn to appreciate each other’s skills. We can choose which skills to learn, and each skill becomes an arrow in the quiver that we can use when needed.

That’s what we’re here to do at Boxes and Arrows: help us all learn about the wide variety of arrows that are available, and when and how to best use each one.

Which brings me to a complaint I’ve heard about Boxes and Arrows—it’s too much to read every word every week.

That’s fine. We won’t stop you if you don’t want to read the whole thing, but I like to think of our content as a rich and varied buffet for our community of practices. Not everything may be particularly relevant or compelling for you during a particular week. If that’s the case, we hope you’ll check back next week, since the buffet will always be changing.

But more than that we hope you’ll let us know what’s not on the menu that’s particularly appetizing to you. What topics should we be covering? Let us know.

We’re particularly interested in hearing how we can not only talk among ourselves but talk to the business people who set the direction and the technologists who make things happen.

Having a successful product (including websites and software) requires hitting the right intersection among business goals, technological feasibility and design desirability. So Boxes and Arrows hopes to help integrate our own wide array of skills with others’ skills, in order to have the fullest and richest quiver available in our joint efforts to hit that target.

(P.S. Thanks to Keith Instone, Peter Boersma and Lisa Chan who co-organized the HCI/IA workshop.)

Marla Olsen
Editor, Chief Curmudgeon
Boxes and Arrows

Speaking in Tongues

In last month’s welcome, I set out to describe Boxes and Arrows’ purpose and goals. On a line by itself I stated: this is not a place for jargon. I felt that was important enough to call out. I am certainly being called to task for that.

Perhaps I should have stated this will be a place free of jargon. Ridding our writing of jargon is a good goal, but a more complex task than one might think. That said, it’s important to define what jargon is and what jargon isn’t.

Jargon is words used as a gating mechanism. We use jargon when we wish to keep out those who are not like “us,” whoever “us” may be. Jargon is when we replace perfectly good, accessible English with slang, acronyms and other mangled phraseology. “Monetize” was a dot-com jargon term. It meant “find a way to make a profit from,” and it was used partially out of laziness and partially to make the people using the word feel like insiders (and perhaps not morons who forgot they had to make a dime on their crazy schemes). 80-20 was a rule for profits—20 percent of your users provide 80 percent of your profit—that became a noun: “Well Joe, the way I see it, it’s an 80-20.”

Jargon is not using a fancy word appropriately, but it is jargon when the fancy word replaces a simpler correct word. “Paradigm” has often given me fits because it is a perfectly good word… it’s just been abused. People often use it when “model” is probably a better choice. “Utilize” frequently replaces “use” when “use” is the right word. But there is an appropriate time to use “utilize”… when one means use for profit. We may even choose to utilize jargon if it will serve our sinister purposes in undermining the current design paradigm—but not if there is a better way: a clear, simple, ordinary-language way.

And jargon is not using a big word that you have to look up. Sometimes when we seek to be precise, we use big words. It happens. A dictionary is a good investment.

Acronyms happen. We have to stay alert for them. One man’s A List Apart is another woman’s American Library Association. ALA means different things depending on what crowd you run with.

New words are born when no word existed previously. It wasn’t that long ago that there was no such thing as an internet, or a CPU, or a handheld. To refuse to use these terms because they might be perceived as jargon would be foolishly handicapping ourselves in the service of communicating.

Finally, our authors deserve to be allowed to be eloquent. Adam Greenfield’s style is not Jess McMullin’s, and neither writes like Nathan Shedroff. Nor would we want them to: Boxes and Arrows is composed of people, with a myriad of different voices and different word choices. We will edit to keep their writing accessible, but we will endeavor not to kill the poetry of their language. Writing is a scary and vulnerable activity. An author deserves to have his or her words respected, and editing should enhance, not squash.

So with all these challenges, why try? We try because Boxes and Arrows seeks to be inclusive, not exclusive. We want to cross lines to learn and communicate, and jargon is, as I said, a gating mechanism. So I’ll stick with my earlier statement, though I’ll modify it somewhat:

We will seek to keep this place free of jargon. We will enlist you, the reader, to keep us honest. Every article has a discuss link; call us on the carpet when we say LIS-IA or “directing eyeballs.” Definitely bust us when we complain that ED is not as good as UX because the CHI’ers are more user-centric in their dev-cycles because of the x-mod they do, while ED is all amusement parks and des9.

In return we’ll do our level best to talk straight.

Christina Wodtke
Publisher