The Encyclopedic Revolution

This excerpt is adapted from Chapter 9, “The Encyclopedic Revolution.”

Despite the proliferation of books in the three hundred years after Gutenberg, books remained prohibitively expensive for most Europeans. By the mid-eighteenth century, a typical educated household might own at most a single book (often a popular devotional text like the Book of Hours), and only scholars, clergymen, and wealthy merchants could afford more than a few volumes. There was no such thing as a public library. Still, writers were producing new books in ever-growing numbers, and readers found it increasingly difficult, and often financially impractical, to stay abreast of new scholarship.

At the dawn of the Age of Enlightenment, a handful of philosophers, inspired by Francis Bacon’s quest for a unifying framework of human knowledge[1], started to envision a new kind of book that would synthesize the world’s (or at least Europe’s) intellectual output into a single, accessible work: the encyclopedia. Although encyclopedias had been around in one form or another since antiquity (originating independently in both ancient Greece and China), it was only in the eighteenth century that the general-purpose encyclopedia began to assume its modern form. In 1728, Ephraim Chambers published his Cyclopedia, a compendium of information about the arts and sciences that gained an enthusiastic following among the English literati. The book eventually caught the eye of a Parisian bookseller named André Le Breton, who decided to underwrite a French translation.


Enter Denis Diderot. A prominent but financially struggling writer and philosopher, Diderot occasionally supplemented his income by translating English works into French. When Le Breton approached him about the Cyclopedia, he readily accepted the commission. Soon after embarking on the translation, however, he found himself entranced by the project and persuaded Le Breton to support him in creating more than a simple translation. He wanted to turn the work into something bigger. Much bigger. He wanted to create a “universal” encyclopedia.

Adopting Bacon’s classification as his intellectual foundation, Diderot began the monumental undertaking that would eventually become the Encyclopédie ou Dictionnaire Raisonné des Sciences, des Arts et des Métiers (“Encyclopedia or Dictionary of the Sciences, the Arts, and the Professions”), published in a succession of volumes from 1751 to 1772. With 72,000 articles written by 160 eminent contributors (including notables like Voltaire, Rousseau, and Buffon), the encyclopedia became a compendium of knowledge vaster than anything that had ever been published before.

Diderot did more than just survey the universe of printed books. He took the unprecedented step of expanding the work to include “folk” knowledge gathered from (mostly illiterate) tradespeople. The encyclopedia devoted an enormous portion of its pages to operational knowledge about everyday topics like cloth dyeing, metalwork, and glassware, with entries accompanied by detailed illustrations explaining the intricacies of the trades. Traditionally, this kind of knowledge had passed by word of mouth from master to apprentice within the well-established trade guilds. Since most of the practitioners remained illiterate, almost none of what they knew had ever been written down, and even if it had, it would have held little interest for the powdered-wig habitués of Parisian literary salons. Diderot’s encyclopedia elevated this kind of craft knowledge, giving it equal billing with the traditional domains of literate scholarship.

[Illustration: Figurative System of Human Knowledge]

While publishing this kind of “how-to” information may strike most of us today as an unremarkable act, in eighteenth-century France the decision made a blunt political statement. By granting craft knowledge a status equivalent to the aristocratic concerns of statecraft, scholarship, and religion, Diderot effectively challenged the legitimacy of the aristocracy. It was an epistemological coup d’état.

Diderot’s editorial populism also found expression in passages like this one: “The good of the people must be the great purpose of government. By the laws of nature and of reason, the governors are invested with power to that end. And the greatest good of the people is liberty.” To the royal and papal authorities of eighteenth-century France, these were not particularly welcome sentiments. Pope Clement XIII castigated Diderot and his work (in part because Diderot had chosen to classify religion as a branch of philosophy). King George III of England and Louis XV of France also condemned it. Diderot’s publisher was briefly jailed. In 1759 the French government ordered Diderot to cease publication, seizing 6,000 volumes, which it deposited (appropriately enough) inside the Bastille. But it was too late.

By the time the authorities came after Diderot’s work, the encyclopedia had already found an enthusiastic audience. By 1757 it had attracted 4,000 dedicated subscribers (no small feat in pre-industrial France). Despite the official ban, Diderot and his colleagues continued to write and publish the encyclopedia in secret, and the book began to circulate widely among an increasingly restive French populace.


Diderot died five years before the revolution of 1789, but his credentials as an Enlightenment encyclopedist would serve his family well in the bloody aftermath. When his son-in-law was imprisoned during the revolution and branded an aristocrat, Diderot’s daughter pleaded with the revolutionary committee, citing her father’s populist literary pedigree. On learning of the prisoner’s connection to the great encyclopedist, the committee immediately set him free.

What can we learn from Diderot’s legacy today? His encyclopedia provides an object lesson in the power of new forms of information technology to disrupt established institutional hierarchies. In synthesizing information that had previously been dispersed in local oral traditions and trade networks, Diderot created a radically new model for gathering and distributing information that challenged old aristocratic assumptions about the boundaries of scholarship – and in so doing, helped pave the way for a revolution.

Today, we are witnessing the reemergence of the encyclopedia as a force for radical epistemology. In recent years, Wikipedia’s swift rise to cultural prominence seems to echo Diderot’s centuries-old encyclopedic revolution. With more than three million entries in more than 100 languages, Wikipedia already ranks as by far the largest (and most popular) encyclopedia ever created. And once again, questions of authority and control are swirling. Critics argue that Wikipedia’s lack of quality controls leaves it vulnerable to bias and manipulation, while its defenders insist that openness and transparency ensure fairness and will ultimately allow the system to regulate itself. Just as in Diderot’s time, a deeper tension seems to be emerging between the forces of top-down authority (embodied by journalists, publishers, and academic scholars) and the bottom-up, quasi-anarchist ethos of the Web. And while no one has yet tried to lock Wikipedia up in the Bastille, literary worthies and assorted op-ed writers have condemned the work in sometimes vicious terms, while the prophets of techno-populism celebrate its arrival with an enthusiasm often bordering on zealotry. Once again, the encyclopedia may prove the most revolutionary “book” of all.

About the Book

“Glut: Mastering Information Through the Ages”:http://www.amazon.com/Glut-Mastering-Information-Through-Ages/dp/0309102383/boxesandarrows-20
Alex Wright
2007; Joseph Henry Press
ISBN-10: 0309102383

References

fn1. cf. Bacon’s Novum Organum of 1620

Success Stories

Success is a difficult thing. What exactly does it mean? Rising to the top, or getting what you want? Having respect for your achievements? Whatever it means, it’s a common expression in the Netherlands. You know, that funny place sometimes referred to as Holland, where, as they say goodbye, they wave and say, ‘Success!’ Now, I’ve seen it happen occasionally in other places, but never with the same degree of bitter humor or comical irony. Whatever it actually means, the Dutch seem to suggest, ‘Success… it’s a new thing.’

The Dutch are, historically, very good designers, seeing design as a facet of their culture. Like architecture, design is a public necessity and a purveyor of improvement (or ironic comments on improvement). So, when something becomes improved, like the design of an interface, it is a success, but it’s still only a stepping stone to the next improvement. This idea hints at the problem with success stories. They capture the moment very well, but lead to the feeling that you have reached the end of the improvement, when quite regularly the opposite is true: you have simply stepped a little farther towards a relatively unknown goal.

Designing Interactions by Bill Moggridge[1] does an excellent job of revealing the people and the work behind many of the most important interactive products of our time and discussing their impact on the field of interaction design.

Design Is Rocket Science

I remember reading those Scientific American magazines when I was a kid. I liked them because the design of the magazine was funky, almost a ’50s image brought into the ’80s. It had a flair for interjecting human qualities, humor, lifestyle issues, even cosmetic thinking, in a way that no other ‘serious magazine’ really did. I, like so many other people, did not read it, or even just look through it, for the amazing scientific breakthroughs it reported, but because it was well designed. So, for me, it wasn’t a science magazine, it was good design, and that was rocket science.

“Rocket Science” is one of those expressions that conjures up a lot of thoughts, but mostly it means something is incredibly smart, basically breaching the impossible. Now, I find “The Impossible” breathtakingly exciting; the idea of something not being able to happen just somehow thrills me to bits. For example, it really makes me tick that it’s practically impossible to design a reasonably easy-to-use, or aesthetically interesting, computer interface. But there are a thousand good suggestions on how to get started on such an endeavor in this book.
“Interaction Design: Beyond Human-Computer Interaction”:http://www.wiley.com/WileyCDA/WileyTitle/productCd-0470018666.html [1] is cunningly released at a time when acceptance of Interaction Design as a discipline is reaching a critical mass. The book precipitates a huge turn in the creation of interactive technologies toward the more research/creative or human-centric model, approaching the subject of this change from different angles and illuminating historical insights.


Demolition Derby

Before I even opened this book, I had three reasons to like it. First, Scott Berkun is “one of us”. As a former Microsoft project manager responsible for overseeing early versions of Internet Explorer, he has a strong background in usability, information architecture, and design. His first book, “The Art of Project Management“:http://tinyurl.com/37q6j9 (also “reviewed”:http://www.boxesandarrows.com/view/the_art_of_project_management on Boxes and Arrows), might have been more appropriately titled, The Art of Project Management for Design-Intensive Projects. You might also know Berkun as the creator of the “Interactionary design contests”:http://www.uiweb.com/dsports/interactionary2001.htm held at “ACM’s SIGCHI conferences”:http://sigchi.org/conferences/. He comes from our world and many of his examples are drawn from individuals and organizations familiar to the IA and UX communities. Second, on a more personal level, the book includes two of my photos: See the title pages for chapters 5 and 6. The inclusion of these photos resulted from a request on “Berkun’s blog”:http://www.scottberkun.com/blog/ calling for Flickr-based photos, with two being plucked from my current collection of 3000+ images. Very exciting! Finally, The Myths of Innovation is a short, light book and a handy airplane read. Enough said.

The importance of innovation

Innovation is a hot topic at the moment. Actually, innovation has been a big thing for the last hundred years or more, but perhaps we needed the profusion of business magazines and books to bring this observation into sharp focus. With the tech sector in the ascendancy (again), driven in part by the Web 2.0 movement, examples of innovation are everywhere. We’ve moved beyond the notion of the knowledge economy to recognize that innovative ideas can be the foundation for disruptive business models. This makes Berkun’s book timely, as it sheds light on the underpinning truths that surround innovation. This is what the dust jacket promises:

In The Myths of Innovation, bestselling author Scott Berkun takes a careful look at innovation history, including the software and Internet ages, to reveal how ideas truly become successful innovations–truths that you can apply to today’s challenges.
Using dozens of examples from the history of technology, business, and the arts, you’ll learn how to convert the knowledge you have into ideas that can change the world.

So, does it deliver?

Debunking myths

To explain how innovation works, Berkun starts in the opposite direction and first exposes ten commonly-held beliefs about innovation:
1. The myth of epiphany
2. We understand the history of innovation
3. There is a method for innovation
4. People love new ideas
5. The lone inventor
6. Good ideas are hard to find
7. Your boss knows more about innovation than you
8. The best ideas win
9. Problems and solutions
10. Innovation is always good

In each chapter a myth is introduced and then progressively unraveled and debunked with great wit and charm. This approach helps to structure the book and it offers an easy way to explore innovation. Berkun has a fluid writing style and finds the right balance between informality and powerful word-smithing.

Berkun uses a range of examples from the Renaissance to eBay and Craigslist. Each case study is carefully researched and accompanied by footnotes pointing to further reading. In many instances, Berkun takes unexpected angles on historical cases, presenting new perspectives on stories that have been told and retold for more than a generation. For example, most people are familiar with the story of Post-it notes: the 3M miracle product that evolved from a glue that didn’t stick properly. Far fewer know about the product that preceded Post-it notes (masking tape), or the company’s corporate history. 3M actually stands for Minnesota Mining and Manufacturing, and the company started out drilling for underground minerals to manufacture grinding wheels. It was only after a lab assistant needed a way to mark borders for two-tone car painting that masking tape was developed, and the rest became history. Another example explores the challenges in getting the telegraph adopted, and how Western Union, the company built on that discovery, eventually became the protector of the status quo when new innovations came along, namely the telephone.

Through these examples, Berkun demonstrates that while inventions seem inevitable after the fact, the path to adoption is almost never certain. Great ideas fail, while commercial imperatives drive the success of other innovations.

Providing answers

Readers looking for an innovation checklist or a how-to book will be dissatisfied. One of the myths that Berkun debunks is that there can be a step-by-step guide to innovation. Instead, innovation is a complicated and unpredictable process with many paths: more jigsaw puzzle than straight line. By its nature, innovation explores uncharted territory. It is also the product of a lot of hard work, unexpected insights, the collaboration of many individuals, and sheer, random chance.

When I reached the end of the book, I was disappointed to discover there was no summary chapter wrapping up its message; something akin to, “So, based on these myths, this is how you need to do innovation in practice.” While a concluding chapter would have neatly closed the narrative arc, Berkun was right not to have included one. Instead, the onus is on the reader to review the book again and allow the many gems scattered throughout the text to sink in.

In particular, Berkun outlines a number of key principles and barriers to innovation. They are presented in unassuming lists that belie their value. For example, he outlines eight challenges all innovations must confront and overcome, including sponsorship and funding, capacity for reproduction, and reaching the potential customer. In addition to these challenges, Berkun discusses elements that can influence the speed of adoption, challenges associated with managing innovation, and factors that have influenced historical innovations. Berkun also offers a comprehensive set of checkpoints that can be used to assess approaches to innovation.

What we can learn

There are many heroes idolized within our industry, whether it’s Flickr, eBay, Craigslist, 37 Signals, IDEO, Yahoo, Google, or any of the hundreds of Web 2.0 businesses. All of these organizations are regarded as paragons of innovation, featured prominently at conferences and in case studies. Berkun points out that while much can be learned from these organizations, the myths that surround them can also lead us blindly down the wrong path. If we recreate the funky, fun-filled spaces of the Googleplex, do we automatically become innovative? If we develop functionality that mimics Flickr, will we be able to take on the world?

When starting down the path of innovation, we must do more than blindly copy the formulas so neatly captured and communicated by these leading companies. Yes, we would like some measure of their success, but we would do better to learn from the myths outlined in this book. When we are establishing our design teams, building our startups, or consolidating our consulting firms, we need to consider the ideas presented in The Myths of Innovation. The lessons I took away from the book include the following:

  • Good management has a huge impact on the success of in-house innovation.
  • Innovation is paired with collaboration.
  • The best outcomes derive from a mix of self-awareness and the ability to recognize and explore opportunities when they arise.
  • Oh, and the need for perseverance, no matter how hard the road ahead.

The universal principles and insights captured by Berkun certainly apply to design and user testing. On page 66, Berkun makes the following observation:

“[Innovators] grow so focused on creating that they forget that those innovations are good only if people can use them. While there’s a lot to be said for raising bars and pushing envelopes, breakthroughs happen for societies when innovations diffuse, not when they remain forever ahead of their time.”

Information architects, therefore, have an important role to play in innovation, particularly when making use of ethnographic research techniques. At the end of the day, we don’t win awards for demonstrating how smart or creative we are if no one chooses to make use of our wonderful new innovations. The more we understand our users or customers, the better we’ll be able to create innovations that make their lives easier. Innovation doesn’t happen in isolation, nor is it the result of being struck by a falling apple (or even a falling Apple?). Innovation occurs in the real world, drawn from an understanding of needs, and delivered through a design process that makes the idea into something that will change the world. This is where IAs can contribute.

Conclusion

I started The Myths of Innovation in a positive frame of mind, generated by my interest in the topic (and the excitement of seeing my photos in print). I ended the book similarly enthusiastic. While it isn’t a long read (I started in Cambridge and finished before I touched down in Los Angeles), good books don’t need a lot of words to make their point. Scott Berkun clearly presents his arguments, demolishing many of the misconceptions about innovation. For those of us running businesses or developing new products, it’s a must-read.

About the Book
“The Myths of Innovation”:http://www.amazon.com/Myths-Innovation-Scott-Berkun/dp/0596527055/boxesandarrows-20
Scott Berkun
2007, O’Reilly
ISBN-10: 0596527055

Author’s note: If you want to view more of my book-worthy photos, you can find them on “Flickr”:http://www.flickr.com/photos/shingen_au, or on the site from “my first photography exhibition”:http://www.artbytwo.com.au/index.html.

Lessons From Google Mobile

Last month, Google’s NYC office hosted a sold-out meeting of the New York City Usability Professionals Association (NYC UPA), featuring a presentation by Leland Rechis, a UX designer on Google’s mobile team. Exactly the sort of hyper-intelligent, bespectacled geek one hopes to meet there, Rechis surveyed the key insights the UX group learned while building Google’s “mobile search product”:http://www.google.com/mobile.

Taken aback by the scale of the development effort, I began to wonder how many of the lessons learned were even relevant if you aren’t Google, or at least Google-sized. The basic problems of translating existing services and brands over to the mobile space concern many smaller organizations, but Rechis demonstrated that becoming a global mobile presence presents extraordinary challenges.

Basic problem solving still completely swamps any other creative concern when working on mobile sites. A refreshing blast of Spartan usability problems, mobile site design is uncluttered by your typical namby-pamby web problems. Can a user get the information, and fast? Answer this question and you’re far ahead of everyone else.

The design process described was quite effective at powering through a lot of basic usability problems, but struck me as potentially ill-suited to a younger project that might still be finding itself.

Here are four key points I took home:

Designing a good mobile web user experience requires seemingly endless device, location, and use-specific hacks.

Of course, rendering inconsistency on mobile browsers is worse than it ever was on desktop browsers. Variation in device input (keys, buttons, stick) and output (screen size) creates dramatic UI problems that don’t seem to be going away. Plenty more issues can be traced to local carrier hegemony. Regional patterns of use develop based upon the network capability, handset manufacturer, and content formats most popular in any area. If anything, these differences seem to continue diverging, turning much of the design process into a maddening branching problem. Further customization of the experience based upon the type of search result proved to be another key ingredient. Because browsing a large feature set on a mobile device is so cumbersome, it is critical to intuit the user’s next action and place it accordingly: get directions here, buy those movie tickets, call this person.
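To make the branching problem concrete, here is a minimal sketch (in TypeScript) of how a mobile page might pick a layout from coarse device hints and surface the likeliest next action for each kind of result. This is not Google’s code; the type names, thresholds, and categories are hypothetical stand-ins for the general ideas described in the talk.

```typescript
// Hypothetical layout and result categories -- not Google's actual models.
type Layout = "single-column-text" | "keyed-list" | "touch-cards";
type ResultKind = "business" | "movie" | "contact";

// Coarse capabilities inferred from the User-Agent and carrier data.
interface DeviceHints {
  screenWidth: number;  // pixels, reported or assumed
  hasTouch: boolean;
  hasSoftKeys: boolean;
}

// Pick a page layout from device capabilities: one tiny corner of the
// "maddening branching problem" described above.
function pickLayout(d: DeviceHints): Layout {
  if (d.hasTouch && d.screenWidth >= 320) return "touch-cards";
  if (d.hasSoftKeys) return "keyed-list";
  return "single-column-text";
}

// Surface the likeliest next action for a result, so the user never has
// to dig through a large feature set on a small screen.
function primaryAction(kind: ResultKind): string {
  switch (kind) {
    case "business": return "Get directions";
    case "movie": return "Buy tickets";
    case "contact": return "Call";
  }
}
```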

A shallow learning curve is essential.

Each successful interaction, no matter how minor, invests the user in the application experience. Rechis stressed the importance of always placing low-hanging fruit on the first screen to build user confidence. Further personalization efforts highlight only the features users have used or requested. This is good general advice, but it is critical in mobile applications, where UI standards are few and the failure modes extraordinarily frustrating. It’s worth remembering that the vast majority of Americans don’t even know their phone has a browser.

Localization for the mobile experience is more complex than ever.

It was plainly apparent that we (some small subset of geeky Americans) have simply no idea what the rest of the world expects from their phones. Even with most device, network, and language issues solved, we can still be far from creating an application with any kind of cultural relevance. Basic local conventions complicate something as simple as a web search. Europeans expect a compact page with a few precise links, while many Asian consumers on high-bandwidth networks expect results as screenfuls of colorful content. Driving directions have to be formulated with wayfinding devices that are locally relevant. Familiarity with the technology varies drastically, and direct research in all of these different cultural contexts is the only way to tackle any of these problems.

The UX techniques you know and love are compatible with agile development.

The mobile team used an agile process (though not necessarily all of Google does). If you aren’t already using one yourself, trends suggest you’re likely to come across it soon. I found the way UX and usability techniques were integrated perfectly sensible and a useful validation of many of the concepts well loved on B&A.

An upfront “ideation sprint” was focused on detailing a primary use case. Weekly development sprints were coupled with weekly usability tests, constantly testing the features delivered or modified in the previous sprint. And in the most astonishing detail, the UX team actually gets to sign off on engineers’ work before each release. Progress! These little process details were valuable nuggets for anyone on similar teams.

So, to wrap up:

  • Weekly usability cycles good
  • Agile fits all parts of the design/development process.
  • Global mobile site bad, unless you’re Google
  • Volunteering at your local UPA chapter also good