The Hidden History of Information Management


The fictional heavy-metal band Spinal Tap immortalized the “fine line between clever and stupid.” It’s a similar situation with information access: there’s a fine line between rich and broke. Put another way (by the late cognitive psychologist Herbert Simon): “a wealth of information creates a poverty of attention.”

Today the poverty of attention seems especially pressing. Technology makes it easier and cheaper to store information of all kinds, far outpacing our ability to convert that information into meaning and knowledge. On the plus side for B&A readers, this situation seems likely to keep information architects gainfully employed for some time to come.

But on a broader cultural and historical level, what strategies has society employed to collect, manage, and store information, even with the constant threat of oversupply, and still make this information accessible and meaningful to people over time?

An answer to that question—in fact, many answers—can be found in Glut: Mastering Information Through the Ages, a sweeping new book from Alex Wright about the history of information and information management systems across disciplines, time, and even species (bees, ants, primates, eukaryotes).

Wright, a librarian turned writer and information architect, is no stranger to the Boxes and Arrows community; in fact, he draws on material from two B&A articles (one on IA and sociobiology, another on the Belgian bibliographer Paul Otlet) in his new book, now set within a broader narrative. Glut is an informative, ambitious, and at times frustrating work, as Wright juggles three different roles in shepherding his material: tour guide, curator, and essayist.

Wright The Tour Guide

As a tour guide, Wright is a patient, well-informed, and focused narrator, exploring the roots of information systems including writing, classification schemes, books, and libraries. In this mode, his sweeping connection-making is somewhat akin to the work of science historian and BBC documentarian James Burke (a fan of Glut) in its quest for hidden connections between seemingly disparate subjects and causes.

Wright informs us at the outset that he will avoid the lure of utopian techno-futurism and excavate the story of the information age by looking “squarely backward.” Just how far backward? Two billion years ago for the information architecture practices of multi-celled organisms, and for Homo sapiens, try the Ice Age (about 45,000 years ago). That, Wright tells us, is when our cave-dwelling ancestors started banding together for survival in the face of tougher hunting conditions.

While today we think the biggest challenge of glut is the ensuing time- and attention-management crunch, Glut reminds us that information acquisition did not come easily in the early days of empire building. A central challenge for many cultures was amassing material for that key information storehouse—the library—and protecting these centralized physical and intellectual assets from violent destruction:

“From ancient Sumer to India to China to the Aztec kingdom, the same pattern manifested again and again: first came literacy, then the nation-state, the empire, and ultimately the intellectual apotheosis of the empire, the library. When empires fall, they usually take their libraries with them.”

Among some of the other intriguing stops and observations along Wright’s tour:

  • Beads and pendants served as a very early symbolic communication for Ice Age Homo sapiens, allowing people to create bonds and achieve more complex social connections.
  • “Meta” text of a sort dates as far back as 2300 BC; archeologists have found 2,000 tablets including lists of animals and locations, as well as lists of other tablets.
  • Google’s controversial book-scanning effort seems not far afield from the acquisition policy Wright describes for the Alexandrian library: “The Alexandrian rulers built the great library not just as an act of imperial generosity but also through fiat, confiscation, and occasionally, subterfuge.”

Wright The Curator

Part of Glut feels like an information management museum in book form, and Wright evinces a strong curatorial preference for the quixotic. There’s a sense that he hopes to shift our cultural focus from history’s hit makers to a number of lesser known but meritorious information management ideas from the past that deserve further airtime today.

For example, when Wright works his way up to recent computer history, he avoids focusing on the already well-told and well-documented human-computer interaction story of ARC, PARC, and Apple. Instead, he favors lesser-known milestones in the history of hypertext, with a fresh look at Ted Nelson and several groundbreaking experiments at Brown University (Wright’s undergraduate alma mater).

The Brown University story culminates in a project called Intermedia, which included many features that Wright finds lacking in today’s Web framework, including bi-directional linking (both pointers and targets “know” of the link), and real-time dynamic editing and updating. The project vanished for lack of federal funding in 1994, just before the World Wide Web stepped onto the global stage.

But the central exhibit in this wing is Otlet, the 19th-century Belgian bibliographer whom Wright dubs the Internet’s forgotten forefather. Otlet is best known as the developer of the Universal Decimal Classification (UDC), a flexible, faceted library classification system still in widespread use worldwide in 23 languages.

Glut focuses on Otlet’s vision for something remarkably similar to today’s World Wide Web, and his efforts to realize it with a kind of manual database comprising 12 million facts kept on index cards in an office he called the Mundaneum, to which readers could submit queries for a small fee.

Otlet hoped that ultimately anyone would be able to access all human knowledge across forms—books, records, films, radio, television—remotely from their own homes on multi-windowed screens, and he even went so far as to use the words “Web” and “links.”

Due to financial constraints and dwindling government support, Otlet found his Mundaneum squeezed into progressively smaller accommodations, including a parking lot, until he finally shuttered the project in 1934; a few years later, Nazi troops carted it away.

Wright argues that in some ways, Otlet’s ideas not only foretold but also surpassed the current Web: “Distinguishing Otlet’s vision… is the conviction—long since fallen out of favor—in the possibility of a universal subject classification working in concert with the mutable social forces of scholarship.”

Wright The Essayist

One of Wright’s central themes is the pas de deux between networks and hierarchies, and the need to balance Web 2.0’s bottom-up, technology-enabled crowd wisdom with a classic sense of individual expertise, scholarship, and merit guided by the human hand.

Decrying what he describes as the utopian view that “hierarchical systems are restrictive, oppressive vehicles of control, while networks are open democratic vehicles of personal liberation,” Wright pursues a through-line across time in which networks and hierarchies are seen not only as competitive but also as potentially complementary and reinforcing—even essential to one another:

“Networked systems are not entirely modern phenomena, nor are hierarchal systems necessarily doomed. There is a deeper story at work here. The fundamental tension between networks and hierarchies has percolated for eons. Today we are simply witnessing the latest installment in a long evolutionary drama.”


Wright the essayist is an elusive fellow: he combines humanism and pragmatism, and eschews the received techno-hype that is coming back into vogue in the Web 2.0 era. Yet he does not seem prepared to grab the bullhorn from Wright the tour guide or Wright the curator. Among the arguments Wright puts forward, as best I can tease them out:

  • Google’s PageRank algorithm risks reducing the presentation of information to a popularity contest; previous models throughout the history of information management show the possibility of a more balanced and durable approach between classification by human hands (top down) and social meaning (bottom up).
  • Today’s Web links are inferior to the bi-directional hypertext linking explored in projects at Brown and envisioned by Ted Nelson and others, in which one linked resource would “know” about other links to it. The current state of hypertext doesn’t realize its full promise of helping to navigate information overload in a way that might better help advance human knowledge.
  • Aspects of the Web’s infrastructure (other than nascent Web 2.0 tools) favor one-way consumption rather than two-way discourse, and there’s an ongoing risk of excessive control by corporate interests and unseen technology gatekeepers.

On the book’s very last page, Wright touches on Wikipedia as a modern-day meeting ground for the pull and tug between networks and hierarchies, and notes Wikipedia’s creation of a new hierarchical review process to bolster its credibility. Coming so late and remaining so brief, the discussion seems an afterthought rather than what could have been a convergence of the book’s themes.

Information architects—and anyone curious about the roots of information management—will find much of interest in Glut’s thought-provoking tale. Given the stimulating and contrarian nature of Glut’s ideas, one only wishes Wright would occasionally return from the corridors of the time tunnel and bring his well-informed perspective back to our present age.


To get deeper into the book, read the excerpt below.



About the Book

“Glut: Mastering Information Through the Ages”
Alex Wright
2007; Joseph Henry Press
ISBN-10: 0309102383

The Encyclopedic Revolution

This excerpt is adapted from Chapter 9, “The Encyclopedic Revolution.”

Despite the proliferation of books in the years after Gutenberg, three hundred years later books still remained prohibitively expensive for most Europeans. By the mid-eighteenth century, a typical educated household might own at most a single book (often a popular devotional text like the Book of Hours). Only scholars, clergymen and wealthy merchants could afford to own more than a few volumes. There was no such thing as a public library. Still, writers were producing new books in ever-growing numbers, and readers found it increasingly challenging – and often financially implausible – to stay abreast of new scholarship.

At the dawn of the Age of Enlightenment, a handful of philosophers, inspired by Francis Bacon’s quest for a unifying framework of human knowledge[1], started to envision a new kind of book that would synthesize the world’s (or at least Europe’s) intellectual output into a single, accessible work: the encyclopedia. Although encyclopedias had been around in one form or another since antiquity (originating independently in both ancient Greece and China), it was only in the eighteenth century that the general-purpose encyclopedia began to assume its modern form. In 1728, Ephraim Chambers published his Cyclopedia, a compendium of information about the arts and sciences that gained an enthusiastic following among the English literati. The book eventually caught the eye of a Parisian bookseller named André Le Breton, who decided to underwrite a French translation.


Enter Denis Diderot. A prominent but financially struggling writer and philosopher, Diderot occasionally supplemented his income by translating English works into French. When Le Breton approached him about the Cyclopedia, he readily accepted the commission. Soon after embarking on the translation, however, he found himself entranced by the project. He soon persuaded Le Breton to support him in creating more than a simple translation. He wanted to turn the work into something bigger. Much bigger. He wanted to create a “universal” encyclopedia.

Adopting Bacon’s classification as his intellectual foundation, Diderot began the monumental undertaking that would eventually become the Encyclopédie ou Dictionnaire Raisonné des Sciences, des Arts et des Métiers (“Encyclopedia or Dictionary of the Sciences, the Arts, and the Professions”), published in a succession of volumes from 1751 to 1772. With its 72,000 articles written by 160 eminent contributors (including notables like Voltaire, Rousseau, and Buffon), Diderot turned the encyclopedia into a compendium of knowledge vaster than anything that had ever been published before.

Diderot did more than just survey the universe of printed books. He took the unprecedented step of expanding the work to include “folk” knowledge gathered from (mostly illiterate) tradespeople. The encyclopedia devoted an enormous portion of its pages to operational knowledge about everyday topics like cloth dyeing, metalwork, and glassware, with entries accompanied by detailed illustrations explaining the intricacies of the trades. Traditionally, this kind of knowledge had passed through word of mouth from master to apprentice among the well-established trade guilds. Since most of the practitioners remained illiterate, almost none of what they knew had ever been written down – and even if it had, it would have held little interest for the powdered-wig habitués of Parisian literary salons. Diderot’s encyclopedia elevated this kind of craft knowledge, giving it equal billing with the traditional domains of literate scholarship.

Figurative System of Human Knowledge

While publishing this kind of “how-to” information may strike most of us today as an unremarkable act, in eighteenth-century France the decision marked a blunt political statement. By granting craft knowledge a status equivalent to the aristocratic concerns of statecraft, scholarship, and religion, Diderot effectively challenged the legitimacy of the aristocracy. It was an epistemological coup d’état.

Diderot’s editorial populism also found expression in passages like this one: “The good of the people must be the great purpose of government. By the laws of nature and of reason, the governors are invested with power to that end. And the greatest good of the people is liberty.” To the royal and papal authorities of eighteenth-century France, these were not particularly welcome sentiments. Pope Clement XIII castigated Diderot and his work (in part because Diderot had chosen to classify religion as a branch of philosophy). King George III of England and Louis XV of France also condemned it. His publisher was briefly jailed. In 1759 the French government ordered Diderot to cease publication, seizing 6,000 volumes, which they deposited (appropriately enough) inside the Bastille. But it was too late.

By the time the authorities came after Diderot’s work, the encyclopedia had already found an enthusiastic audience. By 1757 it had attracted 4,000 dedicated subscribers (no small feat in pre-industrial France). Despite the official ban, Diderot and his colleagues continued to write and publish the encyclopedia in secret, and the book began to circulate widely among an increasingly restive French populace.


Diderot died 10 years before the revolution of 1789, but his credentials as an Enlightenment encyclopedist would serve his family well in the bloody aftermath. When his son-in-law was imprisoned during the revolution and branded an aristocrat, Diderot’s daughter pleaded with the revolutionary committee, citing her father’s populist literary pedigree. On learning of the prisoner’s connection to the great encyclopedist, the committee immediately set him free.

What can we learn from Diderot’s legacy today? His encyclopedia provides an object lesson in the power of new forms of information technology to disrupt established institutional hierarchies. In synthesizing information that had previously been dispersed in local oral traditions and trade networks, Diderot created a radically new model for gathering and distributing information that challenged old aristocratic assumptions about the boundaries of scholarship – and in so doing, helped pave the way for a revolution.

Today, we are witnessing the reemergence of the encyclopedia as a force for radical epistemology. In recent years, Wikipedia’s swift rise to cultural prominence seems to echo Diderot’s centuries-old encyclopedic revolution. With more than three million entries in more than 100 languages, Wikipedia already ranks as by far the largest (and most popular) encyclopedia ever created. And once again, questions of authority and control are swirling. Critics argue that Wikipedia’s lack of quality controls leaves it vulnerable to bias and manipulation, while its defenders insist that openness and transparency ensure fairness and ultimately will allow the system to regulate itself. Just as in Diderot’s time, a deeper tension seems to be emerging between the forces of top-down authority (manifesting as journalists, publishers and academic scholars) and the bottom-up, quasi-anarchist ethos of the Web. And while no one has yet tried to lock Wikipedia up in the Bastille, literary worthies and assorted op-ed writers have condemned the work in sometimes vicious terms, while the prophets of techno-populism celebrate its arrival with an enthusiasm often bordering on zealotry. Once again, the encyclopedia may prove the most revolutionary “book” of all.

About the Book

“Glut: Mastering Information Through the Ages”
Alex Wright
2007; Joseph Henry Press
ISBN-10: 0309102383


[1] cf. Bacon’s Novum Organum of 1620

Success Stories


Success is a difficult thing. What exactly does it mean? Rising to the top, or getting what you want? Having respect for your achievements? Whatever it means, it’s a common expression in the Netherlands. You know, that funny place sometimes referred to as Holland, where, as they say goodbye, they wave and say, ‘Success!’ Now, I’ve seen it happen occasionally in other places, but never with the same degree of bitter humor or comical irony. Whatever it actually means, the Dutch seem to suggest, ‘Success… it’s a new thing.’

The Dutch are, historically, very good designers, seeing design as a facet of their culture. Like architecture, design is a public necessity and a purveyor of improvement (or of ironic comments on improvement). So, when something becomes improved, like the design of an interface, it is a success, but it’s still only a stepping stone to the next improvement. This idea hints at the problem with success stories. They capture the moment very well, but lead to the feeling that you have reached the end of the improvement, when quite regularly the opposite is true: you have simply stepped a little farther toward a relatively unknown goal.

Designing Interactions by Bill Moggridge[1] does an excellent job of revealing the people and the work behind many of the most important interactive products of our time and discussing their impact on the field of interaction design.

Design Is Rocket Science


I remember reading those Scientific American magazines when I was a kid. I liked them because the design of the magazine was funky, almost a ’50s image brought into the ’80s. It had a flair for interjecting human qualities, humor, lifestyle issues, even cosmetic thinking, in a way that no other ‘serious magazine’ really did. I, like so many other people, did not read it, or even just look through it, for the amazing scientific breakthroughs it reported, but because it was well designed. So, for me, it wasn’t a science magazine, it was good design, and that was rocket science.

“Rocket Science” is one of those expressions that conjures up a lot of thoughts, but mostly it means something is incredibly smart, basically breaching the impossible. Now, I find “The Impossible” breathtakingly exciting; the idea of something not being able to happen just somehow thrills me to bits. For example, it really makes me tick that it’s practically impossible to design a reasonably easy-to-use, or aesthetically interesting, computer interface. But there are a thousand good suggestions on how to get started on such an endeavor in this book.
“Interaction Design: Beyond Human-Computer Interaction”[1] is cunningly released at a time when acceptance of Interaction Design as a discipline is reaching critical mass. The book precipitates a huge turn in the creation of interactive technologies toward a more research-driven, creative, human-centric model, approaching the subject of this change from different angles and illuminating historical insights.


Demolition Derby


Before I even opened this book, I had three reasons to like it. First, Scott Berkun is “one of us.” As a former Microsoft project manager responsible for overseeing early versions of Internet Explorer, he has a strong background in usability, information architecture, and design. His first book, The Art of Project Management (also reviewed on Boxes and Arrows), might have been more appropriately titled The Art of Project Management for Design-Intensive Projects. You might also know Berkun as the creator of the Interactionary design contests held at ACM’s SIGCHI conferences. He comes from our world, and many of his examples are drawn from individuals and organizations familiar to the IA and UX communities. Second, on a more personal level, the book includes two of my photos: see the title pages for chapters 5 and 6. The inclusion of these photos resulted from a request on Berkun’s blog calling for Flickr-based photos, with two being plucked from my current collection of 3,000+ images. Very exciting! Finally, The Myths of Innovation is a short, light book and a handy airplane read. Enough said.

The importance of innovation

Innovation is a hot topic at the moment. Actually, innovation has been a big thing for the last hundred years or more, but perhaps we needed the profusion of business magazines and books to bring this observation into sharp focus. With the tech sector on the ascendancy (again), driven in part by the Web 2.0 movement, examples of innovation are everywhere. We’ve moved beyond the notion of the knowledge economy to recognize that innovative ideas can be the foundation for disruptive business models. This makes Berkun’s book timely, as it sheds light on the underpinning truths that surround innovation. This is what the dust jacket promises:

In The Myths of Innovation, bestselling author Scott Berkun takes a careful look at innovation history, including the software and Internet ages, to reveal how ideas truly become successful innovations–truths that you can apply to today’s challenges.
Using dozens of examples from the history of technology, business, and the arts, you’ll learn how to convert the knowledge you have into ideas that can change the world.

So, does it deliver?

Debunking myths

To explain how innovation works, Berkun starts in the opposite direction and first exposes ten commonly-held beliefs about innovation:
1. The myth of epiphany
2. We understand the history of innovation
3. There is a method for innovation
4. People love new ideas
5. The lone inventor
6. Good ideas are hard to find
7. Your boss knows more about innovation than you
8. The best ideas win
9. Problems and solutions
10. Innovation is always good

In each chapter a myth is introduced and then progressively unraveled and debunked with great wit and charm. This approach helps to structure the book and it offers an easy way to explore innovation. Berkun has a fluid writing style and finds the right balance between informality and powerful word-smithing.

Berkun uses a range of examples from the Renaissance to eBay and Craigslist. Each case study is carefully researched and accompanied by footnotes pointing to further reading. In many instances, Berkun takes unexpected angles on historical cases, presenting new perspectives on stories that have been told and retold for more than a generation. For example, most people are familiar with the story of Post-it notes: the 3M miracle product that evolved from a glue that didn’t stick properly. Far fewer know about the product that preceded Post-it notes (masking tape), or the company’s corporate history. 3M actually stands for Minnesota Mining and Manufacturing, and the company started out drilling for underground minerals to manufacture grinding wheels. It was only after a lab assistant needed a way to mark borders for two-tone car painting that masking tape was developed, and the rest became history. Another example explores the challenges in getting the telegraph adopted, and how Western Union, the company built on that discovery, eventually became the protector of the status quo when a new innovation came along: the telephone.

Through these examples, Berkun demonstrates that while inventions seem inevitable after the fact, the path to adoption is almost never certain. Great ideas fail, while commercial imperatives drive the success of other innovations.

Providing answers

Readers looking for an innovation checklist or a how-to book will be dissatisfied. One of the myths that Berkun debunks is that there can be a step-by-step guide to innovation. Instead, innovation is a complicated and unpredictable process with many paths—more jigsaw puzzle than straight line. By its nature, innovation explores uncharted territory. It is also the product of a lot of hard work, unexpected insights, the collaboration of many individuals, and sheer, random chance.

When I reached the end of the book, I was disappointed to discover there was not a summary chapter wrapping up its message, something akin to, “So therefore, based on these myths, this is how you need to do innovation in practice.” While a concluding chapter would have neatly closed the narrative arc, Berkun was right not to have included one. Instead, the onus is on the reader to review the book again and allow the many gems scattered throughout the text to sink in.

In particular, Berkun outlines a number of key principles and barriers to innovation. They are presented in unassuming lists that belie their value. For example, he outlines eight challenges all innovations must confront and overcome, including sponsorship and funding, capacity for reproduction, and reaching the potential customer. In addition to these challenges, Berkun discusses elements that can influence the speed of adoption, challenges associated with managing innovation, and factors that have influenced historical innovations. Berkun also offers a comprehensive set of checkpoints that can be used to assess approaches to innovation.

What we can learn

There are many heroes idolized within our industry, whether it’s Flickr, eBay, Craigslist, 37 Signals, IDEO, Yahoo, Google, or any of the hundreds of Web 2.0 businesses. All of these organizations are regarded as paragons of innovation, featured prominently at conferences and in case studies. Berkun points out that while much can be learned from these organizations, the myths that surround them can also blindly lead us down the wrong path. If we recreate the funky, fun-filled spaces of the Googleplex, do we automatically become innovative? If we develop functionalities that mimic Flickr, will we be able to take on the world?

When starting down the path of innovation, we must do more than just blindly copy the formulas so neatly captured and communicated from these leading companies. Yes, we would like some measure of their success, but we would do better to learn from the myths outlined in this book. When we are establishing our design teams, building our startups, or consolidating our consulting firms, we need to consider the ideas presented in The Myths of Innovation. The lessons I took away from the book include the following:

  • Good management has a huge impact on the success of in-house innovation.
  • Innovation is paired with collaboration.
  • The best outcomes derive from a mix of self-awareness and the ability to recognize and explore opportunities when they arise.
  • Oh, and the need for perseverance, no matter how hard the road ahead.

The universal principles and insights captured by Berkun certainly apply to design and user testing. On page 66, Berkun makes the following observation:

“[Innovators] grow so focused on creating that they forget that those innovations are good only if people can use them. While there’s a lot to be said for raising bars and pushing envelopes, breakthroughs happen for societies when innovations diffuse, not when they remain forever ahead of their time.”

Information architects, therefore, have an important role to play in innovation, particularly when making use of ethnographic research techniques. At the end of the day, we don’t win awards for demonstrating how smart or creative we are if no one chooses to make use of our wonderful new innovations. The more we understand our users or customers, the better we’ll be able to create innovations that make their lives easier. Innovation doesn’t happen in isolation, nor is it the result of being struck by a falling apple (or even a falling Apple?). Innovation occurs in the real world, drawn from an understanding of needs, and delivered through a design process that makes the idea into something that will change the world. This is where IAs can contribute.


I started The Myths of Innovation in a positive frame of mind, generated by my interest in the topic (and the excitement of seeing my photos in print). I ended the book similarly enthusiastic. While it isn’t a long read (I started in Cambridge and finished before I touched down in Los Angeles), good books don’t need a lot of words to make their point. Scott Berkun clearly presents his arguments, demolishing many of the misconceptions about innovation. For those of us running businesses or developing new products, it’s a must-read.

About the Book
“The Myths of Innovation”
Scott Berkun
2007, O’Reilly
ISBN-10: 0596527055

Author’s note: If you want to view more of my book-worthy photos, you can find them on Flickr, or on the site from my first photography exhibition.