The Hidden History of Information Management

Written by: Bob Goodman

The fictional heavy-metal band Spinal Tap immortalized the “fine line between clever and stupid.” It’s a similar situation with information access: there’s a fine line between rich and broke. Put another way (by the late cognitive psychologist Herbert Simon): “a wealth of information creates a poverty of attention.”

Today the poverty of attention seems especially pressing. Technology makes it easier and cheaper to store information of all kinds, far outpacing our ability to convert that information into meaning and knowledge. On the plus side for B&A readers, this situation seems likely to keep information architects gainfully employed for some time to come.

But on a broader cultural and historical level, what strategies has society employed to collect, manage, and store information, even with the constant threat of oversupply, and still make this information accessible and meaningful to people over time?

An answer to that question—in fact, many answers—can be found in Glut: Mastering Information Through The Ages, a sweeping new book from Alex Wright about the history of information and information management systems across disciplines, time, and even species (bees, ants, primates, eukaryotes).

Wright, a librarian turned writer and information architect, is no stranger to the Boxes and Arrows community; in fact, he draws on material from two B&A articles (one on IA and sociobiology, another on the Belgian bibliographer Paul Otlet) in his new book, now set in a broader narrative. Glut is an informative, ambitious, and at times frustrating work, as Wright juggles three different roles in shepherding his material: tour guide, curator, and essayist.

Wright The Tour Guide

As a tour guide, Wright is a patient, well-informed, and focused narrator, exploring the roots of information systems including writing, classification schemes, books, and libraries. In this mode, his sweeping connection-making is somewhat akin to the work of science historian and BBC documentarian James Burke (a fan of Glut) in its quest for hidden connections between seemingly disparate subjects and causes.

Wright informs us at the outset that he will avoid the lure of utopian techno-futurism and excavate the story of the information age by looking “squarely backward.” Just how far backward? Two billion years ago for the information architecture practices of multicellular organisms; for Homo sapiens, try the Ice Age (about 45,000 years ago). That, Wright tells us, is when our cave-dwelling ancestors started banding together for survival in the face of tougher hunting conditions.

While today we think the biggest challenge of glut is the ensuing time and attention management crunch, Glut reminds us that information acquisition did not come easily in the early days of empire building. A central challenge for many cultures was amassing material for that key information storehouse—the library—and trying to protect these centralized physical and intellectual assets from violent destruction:

“From ancient Sumer to India to China to the Aztec kingdom, the same pattern manifested again and again: first came literacy, then the nation-state, the empire, and ultimately the intellectual apotheosis of the empire, the library. When empires fall, they usually take their libraries with them.”

Among the other intriguing stops and observations along Wright’s tour:

  • Beads and pendants served as a very early form of symbolic communication for Ice Age Homo sapiens, allowing people to create bonds and achieve more complex social connections.
  • “Meta” text of a sort dates as far back as 2300 BC; archeologists have found 2,000 tablets including lists of animals and locations, as well as lists of other tablets.
  • Google’s controversial book-scanning effort seems not far afield from the acquisition policy described by Wright for the Alexandrian library: “The Alexandrian rulers built the great library not just as an act of imperial generosity but also through fiat, confiscation, and occasionally, subterfuge.”

Wright The Curator

Part of Glut feels like an information management museum in book form, and Wright evinces a strong curatorial preference for the quixotic. There’s a sense that he hopes to shift our cultural focus from history’s hit makers to a number of lesser-known but meritorious information management ideas from the past that deserve further airtime today.

For example, when Wright works his way up to recent computer history, he avoids focusing on the already well-told and well-documented human-computer interaction story of ARC, PARC, and Apple. Instead, he favors lesser-known milestones in the history of hypertext, with a fresh look at Ted Nelson and several groundbreaking experiments at Brown University (Wright’s undergraduate alma mater).

The Brown University story culminates in a project called Intermedia, which included many features that Wright finds lacking in today’s Web framework, including bi-directional linking (both pointers and targets “know” of the link) and real-time dynamic editing and updating. The project vanished for lack of federal funding in 1994, just before the World Wide Web stepped onto the global stage.
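To make the contrast concrete, here is a minimal sketch of what bi-directional linking implies as a data structure; it is written in Python purely for illustration (Intermedia itself was not, of course, built this way). Creating a link registers it with both endpoints, so a target document can enumerate everything that points at it:

    class Document:
        """A node in a tiny hypertext that tracks links in both directions."""

        def __init__(self, title):
            self.title = title
            self.links_out = []  # links where this document is the source
            self.links_in = []   # links where this document is the target

        def link_to(self, target):
            """Create a link; unlike a Web href, the target learns of it too."""
            link = (self, target)
            self.links_out.append(link)
            target.links_in.append(link)  # the bi-directional half

        def backlinks(self):
            """Titles of every document pointing here."""
            return [source.title for source, _ in self.links_in]

    essay = Document("Essay on Otlet")
    review = Document("Review of Glut")
    review.link_to(essay)
    print(essay.backlinks())  # ['Review of Glut']: the target "knows" of the link

On today’s Web, by contrast, an href lives only at the source; discovering backlinks means crawling the whole network, which is one reason search engines became indispensable.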

But the central exhibit in this wing is Otlet, the 19th-century Belgian bibliographer whom Wright dubs the Internet’s forgotten forefather. Otlet is best known as the developer of the Universal Decimal Classification (UDC), a flexible, faceted library classification system still in widespread use worldwide in 23 languages.

Glut focuses on Otlet’s vision for something remarkably similar to today’s World Wide Web, and his efforts to realize it with a kind of manual database: 12 million facts kept on index cards in an office he called the Mundaneum, to which readers could submit queries for a small fee.

Otlet hoped that ultimately anyone would be able to access all human knowledge across forms—books, records, films, radio, television—remotely from their own homes on multi-windowed screens, and he even went so far as to use the words “Web” and “links.”

Due to financial constraints and dwindling government support, Otlet found his Mundaneum squeezed into progressively smaller accommodations, including a parking lot, until he finally shuttered the project in 1934; a few years later, Nazi troops carted it away.

Wright argues that in some ways, Otlet’s ideas not only foretold but also surpassed the current Web: “Distinguishing Otlet’s vision… is the conviction—long since fallen out of favor—in the possibility of a universal subject classification working in concert with the mutable social forces of scholarship.”

Wright The Essayist

One of Wright’s central themes is the pas de deux between networks and hierarchies, and the need to balance Web 2.0’s bottom-up, technology-enabled crowd wisdom with a classic sense of individual expertise, scholarship, and merit guided by the human hand.

Decrying what he describes as the utopian view that “hierarchical systems are restrictive, oppressive vehicles of control, while networks are open democratic vehicles of personal liberation,” Wright pursues a through-line across time in which networks and hierarchies are seen not only as competitive but also as potentially complementary and reinforcing—even essential to one another:

“Networked systems are not entirely modern phenomena, nor are hierarchal systems necessarily doomed. There is a deeper story at work here. The fundamental tension between networks and hierarchies has percolated for eons. Today we are simply witnessing the latest installment in a long evolutionary drama.”

Wright the essayist is an elusive fellow: he combines humanism and pragmatism, and eschews the received techno-hype that is coming back into vogue in the Web 2.0 era. Yet he does not seem prepared to grab the bullhorn from Wright the historian or Wright the curator. Among the arguments Wright puts forward, as best I can tease them out:

  • Google’s PageRank algorithm risks reducing the presentation of information to a popularity contest; previous models throughout the history of information management show the possibility of a more balanced and durable combination of classification by human hands (top down) and social meaning (bottom up). (A toy sketch of the ranking dynamic appears after this list.)
  • Today’s Web links are inferior to the bi-directional hypertext linking explored in projects at Brown and envisioned by Ted Nelson and others, in which one linked resource would “know” about other links to it. The current state of hypertext doesn’t realize its full promise of helping to navigate information overload in a way that might better help advance human knowledge.
  • Aspects of the Web’s infrastructure (other than nascent Web 2.0 tools) favor one-way consumption rather than two-way discourse, and there’s an ongoing risk of excessive control by corporate interests and unseen technology gatekeepers.
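As a rough illustration of the first point, here is a toy version of the PageRank idea in Python; it is a hedged sketch of the basic power-iteration scheme, not Google’s actual implementation, and the page names are invented for the example. A page’s score is driven entirely by link structure, so heavily linked-to pages dominate regardless of any editorial judgment of merit:

    def pagerank(links, damping=0.85, iterations=50):
        """Toy power-iteration PageRank: rank flows along links, so heavily
        linked-to pages win, with no human classification involved."""
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
            for page, outgoing in links.items():
                for target in outgoing:
                    new_rank[target] += damping * rank[page] / len(outgoing)
            rank = new_rank
        return rank

    # Three pages: "a" and "b" both link to "popular"; "popular" links to "a".
    links = {"popular": ["a"], "a": ["popular"], "b": ["popular"]}
    print(pagerank(links))  # "popular" ranks highest purely on inbound links

Nothing in the computation knows what any page is about, which is precisely the gap Wright suggests human classification once filled.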

On the book’s very last page, Wright touches on Wikipedia as a modern-day meeting ground for the pull and tug between networks and hierarchies, and notes Wikipedia’s creation of a new hierarchical review process to bolster its credibility. Coming so late and remaining so brief, the discussion seems an afterthought rather than what could have been a convergence of the book’s themes.

Information architects—and anyone curious about the roots of information management—will find much of interest in Glut’s thought-provoking tale. Given the stimulating and contrarian nature of Glut’s ideas, one only wishes Wright would occasionally return from the corridors of the time tunnel and bring his well-informed perspective back to our present age.

To get deeper into the book, “read the excerpt”:http://www.boxesandarrows.com/view/the-encyclopedic.

About the Book

“Glut: Mastering Information Through the Ages”:http://www.amazon.com/Glut-Mastering-Information-Through-Ages/dp/0309102383/boxesandarrows-20
Alex Wright
2007; Joseph Henry Press
ISBN-10: 0309102383

The Encyclopedic Revolution

Written by: Alex Wright
This excerpt is adapted from Chapter 9, “The Encyclopedic Revolution.”

Despite the proliferation of books after Gutenberg, three hundred years later books still remained prohibitively expensive for most Europeans. By the mid-eighteenth century, a typical educated household might own at most a single book (often a popular devotional text like the Book of Hours). Only scholars, clergymen, and wealthy merchants could afford to own more than a few volumes. There was no such thing as a public library. Still, writers were producing new books in ever-growing numbers, and readers found it increasingly challenging – and often financially implausible – to stay abreast of new scholarship.

At the dawn of the Age of Enlightenment, a handful of philosophers, inspired by Francis Bacon’s quest for a unifying framework of human knowledge[1], started to envision a new kind of book that would synthesize the world’s (or at least Europe’s) intellectual output into a single, accessible work: the encyclopedia. Although encyclopedias had been around in one form or another since antiquity (originating independently in both ancient Greece and China), it was only in the eighteenth century that the general-purpose encyclopedia began to assume its modern form. In 1728, Ephraim Chambers published his Cyclopedia, a compendium of information about the arts and sciences that gained an enthusiastic following among the English literati. The book eventually caught the eye of a Parisian bookseller named André Le Breton, who decided to underwrite a French translation.

Enter Denis Diderot. A prominent but financially struggling writer and philosopher, Diderot occasionally supplemented his income by translating English works into French. When Le Breton approached him about the Cyclopedia, he readily accepted the commission. Soon after embarking on the translation, however, he found himself entranced by the project. He soon persuaded Le Breton to support him in creating more than a simple translation. He wanted to turn the work into something bigger. Much bigger. He wanted to create a “universal” encyclopedia.

Adopting Bacon’s classification as his intellectual foundation, Diderot began the monumental undertaking that would eventually become the Encyclopédie ou Dictionnaire Raisonné des Sciences, des Arts et des Métiers (“Encyclopedia or Dictionary of the Sciences, the Arts, and the Professions”), published in a succession of volumes from 1751 to 1772. With a massive collection of 72,000 articles written by 160 eminent contributors (including notables like Voltaire, Rousseau, and Buffon), Diderot turned the encyclopedia into a compendium of knowledge vaster than anything that had ever been published before.

Diderot did more than just survey the universe of printed books. He took the unprecedented step of expanding the work to include “folk” knowledge gathered from (mostly illiterate) tradespeople. The encyclopedia devoted an enormous portion of its pages to operational knowledge about everyday topics like cloth dyeing, metalwork, and glassware, with entries accompanied by detailed illustrations explaining the intricacies of the trades. Traditionally, this kind of knowledge had passed through word of mouth from master to apprentice among the well-established trade guilds. Since most of the practitioners remained illiterate, almost none of what they knew had ever been written down – and even if it had, it would have held little interest for the powdered-wig habitués of Parisian literary salons. Diderot’s encyclopedia elevated this kind of craft knowledge, giving it equal billing with the traditional domains of literate scholarship.

[Figure: the Encyclopédie’s “Figurative System of Human Knowledge”]

While publishing this kind of “how-to” information may strike most of us today as an unremarkable act, in eighteenth-century France the decision marked a blunt political statement. By granting craft knowledge a status equivalent to the aristocratic concerns of statecraft, scholarship, and religion, Diderot effectively challenged the legitimacy of the aristocracy. It was an epistemological coup d’état.

Diderot’s editorial populism also found expression in passages like this one: “The good of the people must be the great purpose of government. By the laws of nature and of reason, the governors are invested with power to that end. And the greatest good of the people is liberty.” To the royal and papal authorities of eighteenth-century France, these were not particularly welcome sentiments. Pope Clement XIII castigated Diderot and his work (in part because Diderot had chosen to classify religion as a branch of philosophy). King George III of England and Louis XV of France also condemned it. His publisher was briefly jailed. In 1759 the French government ordered Diderot to cease publication, seizing 6,000 volumes, which they deposited (appropriately enough) inside the Bastille. But it was too late.

By the time the authorities came after Diderot’s work, the encyclopedia had already found an enthusiastic audience. By 1757 it had attracted 4,000 dedicated subscribers (no small feat in pre-industrial France). Despite the official ban, Diderot and his colleagues continued to write and publish the encyclopedia in secret, and the book began to circulate widely among an increasingly restive French populace.

Diderot died five years before the revolution of 1789, but his credentials as an Enlightenment encyclopedist would serve his family well in the bloody aftermath. When his son-in-law was imprisoned during the revolution and branded an aristocrat, Diderot’s daughter pleaded with the revolutionary committee, citing her father’s populist literary pedigree. On learning of the prisoner’s connection to the great encyclopedist, the committee immediately set him free.

What can we learn from Diderot’s legacy today? His encyclopedia provides an object lesson in the power of new forms of information technology to disrupt established institutional hierarchies. In synthesizing information that had previously been dispersed in local oral traditions and trade networks, Diderot created a radically new model for gathering and distributing information that challenged old aristocratic assumptions about the boundaries of scholarship – and in so doing, helped pave the way for a revolution.

Today, we are witnessing the reemergence of the encyclopedia as a force for radical epistemology. In recent years, Wikipedia’s swift rise to cultural prominence seems to echo Diderot’s centuries-old encyclopedic revolution. With more than three million entries in more than 100 languages, Wikipedia already ranks as by far the largest (and most popular) encyclopedia ever created. And once again, questions of authority and control are swirling. Critics argue that Wikipedia’s lack of quality controls leaves it vulnerable to bias and manipulation, while its defenders insist that openness and transparency ensure fairness and ultimately will allow the system to regulate itself. Just as in Diderot’s time, a deeper tension seems to be emerging between the forces of top-down authority (manifesting as journalists, publishers and academic scholars) and the bottom-up, quasi-anarchist ethos of the Web. And while no one has yet tried to lock Wikipedia up in the Bastille, literary worthies and assorted op-ed writers have condemned the work in sometimes vicious terms, while the prophets of techno-populism celebrate its arrival with an enthusiasm often bordering on zealotry. Once again, the encyclopedia may prove the most revolutionary “book” of all.

References

fn1. cf. Bacon’s Novum Organum of 1620

Success Stories

Written by: Clifton Evans

Success is a difficult thing. What exactly does it mean? Rising to the top, or getting what you want? Having respect for your achievements? Whatever it means, it’s a common expression in the Netherlands. You know, that funny place sometimes referred to as Holland, where, as they say goodbye, people wave and say, ‘Success!’ Now, I’ve seen it happen occasionally in other places, but never with the same degree of bitter humor or comical irony. Whatever it actually means, the Dutch seem to suggest, ‘Success… it’s a new thing.’

The Dutch are, historically, very good designers, seeing design as a facet of their culture. Like architecture, design is a public necessity and a purveyor of improvement (or of ironic comments on improvement). So, when something becomes improved, like the design of an interface, it is a success, but it’s still only a stepping stone to the next improvement. This idea hints at the problem with success stories. They capture the moment very well, but they lead to the feeling that you have reached the end of the improvement, when quite regularly the opposite is true–you have simply stepped a little farther toward a relatively unknown goal.

Designing Interactions by Bill Moggridge[1] does an excellent job of revealing the people and the work behind many of the most important interactive products of our time and discussing their impact on the field of interaction design. The products whose stories appear in this book have dead-simple design approaches behind them and should give us pride as designers, knowing that the best things out there have come from a relatively painless approach. We should be honest, however: this isn’t the whole story, as most of these products come from the efforts of multiple people, from integrating the opinions of the general public to copying other designs–and, in fact, almost always some combination of all these things.

While it’s a great read, this book might lead you to believe otherwise, slightly, as it is biased toward the perspectives and histories of a few ‘successful’ designers, and not the entire output of any given design culture, never mind the much larger international culture of interaction design. One of the central themes is summarized early in the book: the core skills of design are synthesis, understanding people, and iterative prototyping. While most designers can agree that this statement is very insightful, especially coming from Stu Card, one of the computer science brains at Xerox PARC in the seventies, it doesn’t take into account simple influences like access to production lines, distribution, backing, and the factors mentioned above. In that light, the statement comes off like a sales pitch to gain access to things that are necessary, but only relevant when you are already part of the industrial complex.

Still, a huge amount of valuable information lies in this tome, and the book should go on your shelf as a resource if nothing else. Be forewarned that there is a certain amount of social-network back-patting and ‘Apple glorification’ in this book that is kind of scary. I’m a Mac user, always have been, not because it is the supreme operating system design, but because it is slightly better, if that, than the only other major competitor on the market.

Now, not to get drawn into the old debate: Designing Interactions does a good job of summarizing how the current mouse-and-windows operating system came to be. It does not provide much insight into what else was happening at that time. I’d like more stories from ‘the innovative seventies’, and how some of those ideas might have been able to help us if they had evolved. We all know we could use a period of cultural R&D like that in this field again, especially without the computer science (CS) focus. If you’re looking for a book that tells a bit more, check out Howard Rheingold’s Tools for Thought[2].

When pining for a period of innovation without the CS focus, I’m not saying that pure ‘design talent’ can solve all design problems, though it definitely helps, as you’ll see by reading the stories in Designing Interactions. The problem is that designers (through agencies, firms, shops, and individual practice) are responsible for only a very small percentage of the designs out there, leaving the rest to be designed by other means–technical, industrial, or otherwise ad hoc. As a result, most people rarely encounter “decent” design.

Perhaps this is just a numbers game for establishing creative organizations. If there were more creative approaches out there, the market would reap the rewards and the creative approaches would prove their worth. Since that has yet to be proven (or proves impossible), perhaps we should shift toward a more creative approach?

To turn the coin on its head, before I get too strong-minded about creative approaches: as much as design is indeed an art form, design also has too strong a focus on the notion of “the elite,” and Designing Interactions certainly reflects that. To be part of true public awareness, as in countries like the Netherlands, design requires a certain amount of separation from the industrial complex, or at least from the companies that are fixated on it. Creative development seems to be more about the culture in which it is created, and less about developing the best products for the highest bidder.

The book tends to be agreeable to this principle on average, including some examples of more responsible people designing for the culture they live in, not for “the future” or “the market.” Three examples in particular shed light on how design could be done, how the technology industry is indeed very backwards, and how most of us just twiddle our thumbs when it comes to making decent and responsible products.

Purple Moon, for those unaware, was a very innovative research-project-turned-games-company, led by Brenda Laurel, a guru in the interaction communities. Most innovative was its focus on a market completely untapped by the IT industry: young girls. In that respect, it was successful. Even its crash–like those of so many other decent dotcom-era projects–does not negate its real success.

Not only did the Purple Moon empire have a huge member base, it was the first successful product, in perhaps the history of computing, aimed at the young female market. In some ways, we’re actually talking about the Facebook of its time for little girls. To put it more bluntly, Purple Moon was the only product out of Silicon Valley, in its history, that would have appealed to any of the young mothers I know today. It’s a shame and a disgrace that nothing even remotely along these lines has been substantially pursued since.

Another example of a responsibility-based project in this book is some of the work that the Live|Work outfit out of London put together. They focus on service design, looking at the ecologies of interactive systems, at how things like banking and automobiles affect our everyday lives, and at solutions to some of the problems that these larger systems have in terms of interaction.

Live|Work thinks above and beyond everyday products and looks at the systems those products operate within. Moggridge highlights one project, an automobile network for the UK, in which new fuel-efficient models of cars, the Fiat Multi+, would be released on more of a licensing model than an ownership model. Given the infrastructure realities of the automobile in Europe, particularly in the cities, this project entailed working with the Italian manufacturer and the UK government to implement a more cost-effective model of transportation, resulting in a more sustainable impact on the culture and the overall ecology surrounding it.

Another story revolves around something more elegant and well-designed than even the iPod. For an Epson “conceptual design” project, a group of design researchers at Ideo Tokyo created a set of printers more like furniture than appliances, more like tables and shelves than objects typically sitting on top of them. One printer even simply had a sheet draped over it, so that the printout slid out from underneath–very elegant and mysterious. The Epson project exemplifies an exercise where the focus is not on the technology, but the aesthetic impact on its resident environment.

Projects like this uncover people’s unsaid desires: they would actually like to have printers like these. It’s less that the printers are “better designed,” “more elegant,” “fancy,” or even “Japanese”; we simply enjoy looking at these artifacts. They may even be “presentable,” and would make the most elegant computer (including a Macintosh) look robotic and foreign. Something hard-edged lies in our current technology, something unfamiliar. Projects like this showcase the potential of comfort with technology. As things become more ubiquitous, the need to create devices that are unobtrusive and familiar will be a governing factor.

While reading another story, about Will Wright and the making of SimCity, I overheard someone sitting next to me in the cafe say, “He makes nice scones.” I wondered to myself, can this guy behind this video game make decent scones? While he might be able to, would he share the recipe? Should we just ask him how it was done, or is it a “family recipe” secretly handed down through the generations? As Wright says in the book, SimCity is not one of those stupid shoot-up games. Perhaps it’s a valuable contribution to society then. Why not let other people know how it’s done, like those tasty scones? Or at least give us the basic ingredients.

This last thought implies the real problem that Moggridge works to reveal, though I feel the book is far too subtle in its approach to really “hit the nail on the head.” The primary theme of this book, other than the success stories of our favorite collectibles, is how most of the most popular designs were created with a “popular approach”: an individual drawing on a napkin, guiding a secretary to imagine the ideal text editor on a blank monitor, chatting informally in the hallway, or packing up and going somewhere else where people were willing to listen. Just like baking scones, these are everyday, ordinary scenarios, and that’s how great design is created. This book does a wonderful job of showing that success stories are just regular accounts.

For me, at least, with the success stories of the most creative companies out there, like Ideo, the focus lies in blending the business process with the creative. Even at these design-driven shops, they tend to lean heavily on process, not creativity, as the real explanation of the work, or at least of its value. There just aren’t many completely creative-focused interaction design organizations out there. There are a ton of research-, design-, analytics-, and technology-driven organizations, all blending their offerings with creative work to an extent, but only an extent. In this light, Moggridge paints a relatively pretty picture of a new wave of possibilities by showing that success is born not out of a process, but happens organically like everything else.

For all practical purposes, Designing Interactions is about Ideo and its connections to Silicon Valley, with the occasional Tokyo or MIT connection. The subtlety ends up being only partly gratuitous, with the connections thrown in for what seems to be comparison’s sake. It’s an important book in that it bridges a relatively huge gap in understanding between Silicon Valley and the rest of the world in terms of what we should be doing with technology. Moggridge does a great job of bridging that gap by focusing on the projects at MIT, which have over the years resembled a lot of what the labs, artists, and design communities outside of North America consider to be part of interaction design.

While this book covers the histories of Apple, hyperlinks, Google, SimCity, I-Mode, the iPod, the Palm Pilot, laptops, and tablets, the main question that I feel it stirs up is, “How are we going to reflect our culture with all this technology?”

From reading these success stories, my answer is, “We can’t represent our culture if the creation of all of our artifacts is done in secret.” Most cultures take part in the design of their handicrafts, their instruments, tools, utensils, equipment, toys and decorative artifacts, but what are we doing with technology? Quite the opposite.

Designing Interactions gives access to a very detailed and adeptly summarized history of commercial interaction design. It’s an invaluable resource for anyone who wants to know what happened to get us to this point, especially with computer interfaces. But, again, it raises a question that begs to be answered: “Why did these few people have such an effect, one that more designers producing more varied designs could have had?”

To end with a final thought based on an old expression: nature never produces the exact same thing twice. Should we all not be working to achieve this state of natural variation and symbiosis? We’ll not get there focusing just on success stories or processes, but we can certainly learn how they can help us feel confident in our own methods.

fn1. Moggridge, Bill. “Designing Interactions”:http://mitpress.mit.edu/catalog/item/default.asp?ttype=2&tid=10934. MIT Press; 2007. Buy from: “MIT Press”:http://mitpress.mit.edu/catalog/item/default.asp?ttype=2&tid=10934 | “Amazon”:http://www.amazon.com/Designing-Interactions-Bill-Moggridge/dp/0262134748/boxesandarrows-20

fn2. Rheingold, Howard. Tools for Thought: The History and Future of Mind-Expanding Technology. MIT Press; 2000.

Design Is Rocket Science

Written by: Clifton Evans

I remember reading those Scientific American magazines when I was a kid. I liked them because the design of the magazine was funky, almost a ’50s image brought into the ’80s. It had a flair for interjecting human qualities, humor, lifestyle issues, even cosmetic thinking, in a way that no other ‘serious magazine’ really did. I, like so many other people, did not read it, or even just look through it, for the amazing scientific breakthroughs it reported, but because it was well designed. So, for me, it wasn’t a science magazine; it was good design, and that was rocket science.

“Rocket Science” is one of those expressions that conjures up a lot of thoughts, but mostly it means something is incredibly smart, basically breaching the impossible. Now, I find “The Impossible” breathtakingly exciting; the idea of something not being able to happen just somehow thrills me to bits. For example, it really gets me that it’s practically impossible to design a reasonably easy-to-use, or aesthetically interesting, computer interface. But there are a thousand good suggestions on how to get started on such an endeavor in this book.

“Interaction Design: Beyond Human-Computer Interaction”:http://www.wiley.com/WileyCDA/WileyTitle/productCd-0470018666.html [1] is cunningly released at a time when acceptance of interaction design as a discipline is reaching critical mass. The book precipitates a huge turn in the creation of interactive technologies toward a more research/creative, human-centric model, approaching this change from different angles and illuminating historical insights.

The concept that practical research leads the way to good design is a good thing, but Interaction Design misses an opportunity, in some ways, by highlighting so many decent designs from only a research- or technology-driven perspective. I never really understood how the field of Human-Computer Interaction is scientific anyway, so I’m glad to see the subtitle, “Beyond Human-Computer Interaction,” on the book, signaling a move in the discipline toward design and creativity and away from a focus on hard-nosed research. Designing computer software always struck me as an art form, not a viable arena for measurements and methodologies. Call me biased, but I feel science does a lot of legwork in trying to justify itself in the design of computer interfaces, whereas most people understand that designing a screen interface requires a creative approach.

The book sheds light on this aspect of HCI being a creative endeavor, but stays within the realm of the research-based, or semi-scientific, approach. Even as a social science, the dominant belief in HCI research as the most effective way to design interfaces leaves too little room for real creative design talent. This book serves as a sign of the times by reflecting this outlook.

It’s not that research isn’t appreciated in the design world (especially the findings), but my position is that some results could be found through sheer design approaches. The majority of successful applied designs include the conceptual, aesthetic, and semantic as well as input from the research-based approaches in this book. In my mind, however, sometimes the results of the research can be talked out in a few good casual conversations with other designers about the technology, placement, and end users.

The book does highlight quite a few good approaches that I use as a practitioner, so it certainly covers the reality of doing interaction design. In fact, every possible ethno-social-human-factors method under the sun is in this book, and it would be impossible to integrate many of them, even partially, into a real-world project. It’s an excellent reference book for the shelf, and I know that I’ll refer to it often, even if I can’t use every approach in my projects.

It would be ideal to be able to use all of the information here. However, the reality of everyday design work is such that most of this research only really occurs in academia, amongst the most dedicated usability professionals, or within the lab environment. Unfortunately, these environments are not well known for their ability to produce interactions that are regarded as aesthetically pleasing by the general public. That said, I have employed a number of these approaches and have heard of almost all of them being used in the field, just likely not with the degree of formality that practitioners of traditional HCI tend to expect.

Aimed as a textbook at third- or fourth-year university students, Interaction Design may also prove very interesting in parts to graduate students. It firmly and accessibly plants the history of HCI for design students and takes the edge off the more rigorous image that has accompanied user interface design research in the past. So, it’s a great book if you’re studying, working with a university or college, or just want to get up to snuff.

With the majority of the material backed by research, it should be noted that this book is not light reading. While the approaches themselves are typically not about doing extensive research, an element of practicality pervades the discussions. Some students might find this attitude misleading, especially if their course has more of a creative slant. But if that’s your angle, there are tons of activities and processes in this book that will keep you learning for months.

Science and art can be combined wonderfully, especially when they are used in flexible and semantically meaningful ways. Students who read this book should be given the freedom, and the encouragement, to integrate these techniques into their own approaches, so that they may avoid getting bogged down by the practicalities of these methods. Products in the real world have used research and other practical approaches to create a more humane final design, and this book includes a smattering of these example projects and products. Keep in mind that a personal touch helps humanize these approaches and fit them into creative design projects.

Interaction Design provides a lot of examples of successful design and will prove a great reference for the more pragmatic designers out there. Its rational bent will help designers looking for explanations of what it takes to do something well and why certain things work (e.g., iconography, different types of analysis).

Background information on almost every approach and model out there is included, but, alas, only a few of the examples are elegant. They are research projects, so they are not meant to be elegant. You might say that these types of projects are the stand-by of practitioners who recognize a problem but are not prepared to think of a more acceptable and effective approach. While the end design serves its purpose, it does not do so with the inventiveness and personal value that shine clearly in products like Google Maps or the iPod Click Wheel.

Some examples of such technological determinism:
* The cascading menu: It’s an obviously difficult method of interacting with a system, but the researchers, developers, and the people who put together the operating system SDK did not spend the requisite time inventing a more elegant approach.
* Speech interfaces: The reality of interacting with the system pales in comparison to the theory or the research behind it. Some companies now exploit this flaw by promising customers no phone trees, or that calls will be answered in two rings or fewer.
* Pen-based (gestural) interfaces: Handwriting recognition software worked a lot better on the Newton than even the Palm OS, never mind the current offering on the Tablet PC.

In some ways, interaction design as a practice seems obsessed with process over product. Experience has taught me that if the team overall lacks creative and artistic skills, the product is doomed to become unfriendly or inelegant. Essentially it boils down to politics, even within the smallest team. If there isn’t a general “agree-to-agree” mentality and a good amount of trust in the more creative members of the team, no amount of process, or developing a new one, will help make products that customers want.

I approach the field from a design perspective, meaning two parts visual/creative, one part analytical representative of public needs. When reading scientific books, journals, and textbooks, I usually glance through them, looking for something inspirational, something logical, something that would make sense to the analytical side of my brain. I’m interested in the possibilities of the approaches, how they will affect my projects, and how they help me breach the impossibilities of science. I find it amazing how research and science struggle for elegance unless they also bring creative parts to bear.

Interaction Design, the book, presents many valuable approaches and background on the industry. Still, one should realize that learning this material is like learning to play the piano: you can follow many leads and avenues, especially in terms of extending your practice, but you’ll need creativity and artistry to exercise them well. Buy this book to support that good work, because you can never have enough background knowledge to do your job well.

fn1. Helen Sharp, Yvonne Rogers, and Jenny Preece; “Interaction Design: Beyond Human-Computer Interaction, 2nd Edition”:http://www.wiley.com/WileyCDA/WileyTitle/productCd-0470018666.html; John Wiley & Sons, Inc.; 2007.

Demolition Derby

Written by: James Robertson

Before I even opened this book, I had three reasons to like it. First, Scott Berkun is “one of us”. As a former Microsoft project manager responsible for overseeing early versions of Internet Explorer, he has a strong background in usability, information architecture, and design. His first book, “The Art of Project Management“:http://tinyurl.com/37q6j9 (also “reviewed”:http://www.boxesandarrows.com/view/the_art_of_project_management on Boxes and Arrows), might have been more appropriately titled, The Art of Project Management for Design-Intensive Projects. You might also know Berkun as the creator of the “Interactionary design contests”:http://www.uiweb.com/dsports/interactionary2001.htm held at “ACM’s SIGCHI conferences”:http://sigchi.org/conferences/. He comes from our world and many of his examples are drawn from individuals and organizations familiar to the IA and UX communities. Second, on a more personal level, the book includes two of my photos: See the title pages for chapters 5 and 6. The inclusion of these photos resulted from a request on “Berkun’s blog”:http://www.scottberkun.com/blog/ calling for Flickr-based photos, with two being plucked from my current collection of 3000+ images. Very exciting! Finally, The Myths of Innovation is a short, light book and a handy airplane read. Enough said.

The importance of innovation

Innovation is a hot topic at the moment. Actually, innovation has been a big thing for the last hundred years or more, but perhaps we needed the profusion of business magazines and books to bring this observation into sharp focus. With the tech sector on the ascendancy (again), driven in part by the Web 2.0 movement, examples of innovation are everywhere. We’ve moved beyond the notion of the knowledge economy to recognize that innovative ideas can be the foundation for disruptive business models. This makes Berkun’s book timely, as it sheds light on the underpinning truths that surround innovation. This is what the dust jacket promises:

In The Myths of Innovation, bestselling author Scott Berkun takes a careful look at innovation history, including the software and Internet ages, to reveal how ideas truly become successful innovations–truths that you can apply to today’s challenges.
Using dozens of examples from the history of technology, business, and the arts, you’ll learn how to convert the knowledge you have into ideas that can change the world.

So, does it deliver?

Debunking myths

To explain how innovation works, Berkun starts in the opposite direction and first exposes ten commonly held beliefs about innovation:
1. The myth of epiphany
2. We understand the history of innovation
3. There is a method for innovation
4. People love new ideas
5. The lone inventor
6. Good ideas are hard to find
7. Your boss knows more about innovation than you
8. The best ideas win
9. Problems and solutions
10. Innovation is always good

In each chapter a myth is introduced and then progressively unraveled and debunked with great wit and charm. This approach helps to structure the book and it offers an easy way to explore innovation. Berkun has a fluid writing style and finds the right balance between informality and powerful word-smithing.

Berkun uses a range of examples from the Renaissance to eBay and Craigslist. Each case study is carefully researched and accompanied by footnotes pointing to further reading. In many instances, Berkun takes unexpected angles on historical cases, presenting new perspectives on stories that have been told and retold for more than a generation. For example, most people are familiar with the story of Post-it notes: the 3M miracle product that evolved from a glue that didn’t stick properly. Far fewer know about the product that preceded Post-it notes (masking tape), or the company’s corporate history. 3M actually stands for Minnesota Mining and Manufacturing, and the company started out drilling for underground minerals to manufacture grinding wheels. It was only after a lab assistant needed a way to mark borders for two-tone car painting that masking tape was developed, and the rest became history. Another example explores the challenges in getting the telegraph adopted, and how the company that built on that discovery, Western Union, eventually became the protector of the status quo when new innovations came along–namely the telephone.

Through these examples, Berkun demonstrates that while inventions seem inevitable after the fact, the path to adoption is almost never certain. Great ideas fail, while commercial imperatives drive the success of other innovations.

Providing answers

Readers looking for an innovation checklist or a how-to book will be dissatisfied. One of the myths that Berkun debunks is that there can be a step-by-step guide to innovation. Instead, innovation is a complicated and unpredictable process with many paths–more jigsaw puzzle than straight line. By its nature, innovation explores uncharted territory. It is also the product of a lot of hard work, unexpected insights, the collaboration of many individuals, and sheer, random chance.

When I reached the end of the book, I was disappointed to discover there was not a summary chapter wrapping up its message; something akin to, “So therefore, based on these myths, this is how you need to do innovation in practice.” While a concluding chapter would have neatly closed the narrative arc, Berkun was right not to have included one. Instead, the onus is on the reader to review the book again and allow the many gems scattered throughout the text to sink in.

In particular, Berkun outlines a number of key principles and barriers to innovation. They are presented in unassuming lists that belie their value. For example, he outlines eight challenges all innovations must confront and overcome, including sponsorship and funding, capacity for reproduction, and reaching the potential customer. In addition to these challenges, Berkun discusses elements that can influence the speed of adoption, challenges associated with managing innovation, and factors that have influenced historical innovations. Berkun also offers a comprehensive set of checkpoints that can be used to assess approaches to innovation.

What we can learn

There are many heroes idolized within our industry, whether it’s Flickr, eBay, Craigslist, 37 Signals, IDEO, Yahoo, Google, or any of the hundreds of Web 2.0 businesses. All of these organizations are regarded as paragons of innovation, featured prominently at conferences and in case studies. Berkun points out that while much can be learned from these organizations, the myths that surround them can also blindly lead us down the wrong path. If we recreate the funky, fun-filled spaces of the Googleplex, do we automatically become innovative? If we develop functionalities that mimic Flickr, will we be able to take on the world?

When starting down the path of innovation, we must do more than just blindly copy the formulas so neatly captured and communicated from these leading companies. Yes, we would like some measure of their success, but we would do better to learn from the myths outlined in this book. When we are establishing our design teams, building our startups, or consolidating our consulting firms, we need to consider the ideas presented in The Myths of Innovation. The lessons I took away from the book include the following:

  • Good management has a huge impact on the success of in-house innovation.
  • Innovation is paired with collaboration.
  • The best outcomes derive from a mix of self-awareness and the ability to recognize and explore opportunities when they arise.
  • Oh, and the need for perseverance, no matter how hard the road ahead.

The universal principles and insights captured by Berkun certainly apply to design and user testing. On page 66, Berkun makes the following observation:

“[Innovators] grow so focused on creating that they forget that those innovations are good only if people can use them. While there’s a lot to be said for raising bars and pushing envelopes, breakthroughs happen for societies when innovations diffuse, not when they remain forever ahead of their time.”

Information architects, therefore, have an important role to play in innovation, particularly when making use of ethnographic research techniques. At the end of the day, we don’t win awards for demonstrating how smart or creative we are if no one chooses to make use of our wonderful new innovations. The more we understand our users or customers, the better we’ll be able to create innovations that make their lives easier. Innovation doesn’t happen in isolation, nor is it the result of being struck by a falling apple (or even a falling Apple?). Innovation occurs in the real world, drawn from an understanding of needs, and delivered through a design process that makes the idea into something that will change the world. This is where IAs can contribute.

Conclusion

I started The Myths of Innovation in a positive frame of mind, generated by my interest in the topic (and the excitement of seeing my photos in print). I ended the book similarly enthusiastic. While it isn’t a long read (I started in Cambridge and finished before I touched down in Los Angeles), good books don’t need a lot of words to make their point. Scott Berkun clearly presents his arguments, demolishing many of the misconceptions about innovation. For those of us running businesses or developing new products, it’s a must-read.

About the Book
“The Myths of Innovation”:http://www.amazon.com/Myths-Innovation-Scott-Berkun/dp/0596527055/boxesandarrows-20
Scott Berkun
2007, O’Reilly
ISBN-10: 0596527055

Author’s note: If you want to view more of my book-worthy photos, you can find them on “Flickr”:http://www.flickr.com/photos/shingen_au, or on the site from “my first photography exhibition”:http://www.artbytwo.com.au/index.html.