The Encyclopedic Revolution

This excerpt is adapted from Chapter 9, “The Encyclopedic Revolution.”

Despite the proliferation of books in the years after Gutenberg, three hundred years later books still remained prohibitively expensive for most Europeans. By the mid-eighteenth century, a typical educated household might own at most a single book (often a popular devotional text like the Book of Hours). Only scholars, clergymen and wealthy merchants could afford to own more than a few volumes. There was no such thing as a public library. Still, writers were producing new books in ever-growing numbers, and readers found it increasingly challenging – and often financially implausible – to stay abreast of new scholarship.

At the dawn of the Age of Enlightenment, a handful of philosophers, inspired by Francis Bacon’s quest for a unifying framework of human knowledge[1], started to envision a new kind of book that would synthesize the world’s (or at least Europe’s) intellectual output into a single, accessible work: the encyclopedia. Although encyclopedias had been around in one form or another since antiquity (originating independently in both ancient Greece and China), it was only in the eighteenth century that the general-purpose encyclopedia began to assume its modern form. In 1728, Ephraim Chambers published his Cyclopedia, a compendium of information about the arts and sciences that gained an enthusiastic following among the English literati. The book eventually caught the eye of a Parisian bookseller named André Le Breton, who decided to underwrite a French translation.


Enter Denis Diderot. A prominent but financially struggling writer and philosopher, Diderot occasionally supplemented his income by translating English works into French. When Le Breton approached him about the Cyclopedia, he readily accepted the commission. Soon after embarking on the translation, however, he found himself entranced by the project. He persuaded Le Breton to support him in creating more than a simple translation. He wanted to turn the work into something bigger. Much bigger. He wanted to create a “universal” encyclopedia.

Adopting Bacon’s classification as his intellectual foundation, Diderot began the monumental undertaking that would eventually become the Encyclopédie ou Dictionnaire Raisonné des Sciences, des Arts et des Métiers (“Encyclopedia or Dictionary of the Sciences, the Arts, and the Professions”), published in a succession of volumes from 1751 to 1772. With 72,000 articles written by 160 eminent contributors (including notables like Voltaire, Rousseau, and Buffon), the Encyclopédie was a compendium of knowledge vaster than anything that had ever been published before.

Diderot did more than just survey the universe of printed books. He took the unprecedented step of expanding the work to include “folk” knowledge gathered from (mostly illiterate) tradespeople. The encyclopedia devoted an enormous portion of its pages to operational knowledge about everyday topics like cloth dyeing, metalwork, and glassware, with entries accompanied by detailed illustrations explaining the intricacies of the trades. Traditionally, this kind of knowledge had passed by word of mouth from master to apprentice among the well-established trade guilds. Since most of the practitioners remained illiterate, almost none of what they knew had ever been written down – and even if it had, it would have held little interest for the powdered-wig habitués of Parisian literary salons. Diderot’s encyclopedia elevated this kind of craft knowledge, giving it equal billing with the traditional domains of literate scholarship.

Figurative System of Human Knowledge

While publishing this kind of “how-to” information may strike most of us today as an unremarkable act, in eighteenth-century France the decision amounted to a blunt political statement. By granting craft knowledge a status equivalent to the aristocratic concerns of statecraft, scholarship, and religion, Diderot effectively challenged the legitimacy of the aristocracy. It was an epistemological coup d’état.

Diderot’s editorial populism also found expression in passages like this one: “The good of the people must be the great purpose of government. By the laws of nature and of reason, the governors are invested with power to that end. And the greatest good of the people is liberty.” To the royal and papal authorities of eighteenth-century France, these were not particularly welcome sentiments. Pope Clement XIII castigated Diderot and his work (in part because Diderot had chosen to classify religion as a branch of philosophy). King George III of England and Louis XV of France also condemned it. Diderot’s publisher was briefly jailed. In 1759 the French government ordered Diderot to cease publication, seizing 6,000 volumes, which were deposited (appropriately enough) inside the Bastille. But it was too late.

By the time the authorities came after Diderot’s work, the encyclopedia had already found an enthusiastic audience. By 1757 it had attracted 4,000 dedicated subscribers (no small feat in pre-industrial France). Despite the official ban, Diderot and his colleagues continued to write and publish the encyclopedia in secret, and the book began to circulate widely among an increasingly restive French populace.


Diderot died five years before the revolution of 1789, but his credentials as an Enlightenment encyclopedist would serve his family well in the bloody aftermath. When his son-in-law was imprisoned during the revolution and branded an aristocrat, Diderot’s daughter pleaded with the revolutionary committee, citing her father’s populist literary pedigree. On learning of the prisoner’s connection to the great encyclopedist, the committee immediately set him free.

What can we learn from Diderot’s legacy today? His encyclopedia provides an object lesson in the power of new forms of information technology to disrupt established institutional hierarchies. In synthesizing information that had previously been dispersed in local oral traditions and trade networks, Diderot created a radically new model for gathering and distributing information that challenged old aristocratic assumptions about the boundaries of scholarship – and in so doing, helped pave the way for a revolution.

Today, we are witnessing the reemergence of the encyclopedia as a force for radical epistemology. In recent years, Wikipedia’s swift rise to cultural prominence seems to echo Diderot’s centuries-old encyclopedic revolution. With more than three million entries in more than 100 languages, Wikipedia already ranks as by far the largest (and most popular) encyclopedia ever created. And once again, questions of authority and control are swirling. Critics argue that Wikipedia’s lack of quality controls leaves it vulnerable to bias and manipulation, while its defenders insist that openness and transparency ensure fairness and ultimately will allow the system to regulate itself. Just as in Diderot’s time, a deeper tension seems to be emerging between the forces of top-down authority (manifesting as journalists, publishers and academic scholars) and the bottom-up, quasi-anarchist ethos of the Web. And while no one has yet tried to lock Wikipedia up in the Bastille, literary worthies and assorted op-ed writers have condemned the work in sometimes vicious terms, while the prophets of techno-populism celebrate its arrival with an enthusiasm often bordering on zealotry. Once again, the encyclopedia may prove the most revolutionary “book” of all.

About the Book

“Glut: Mastering Information Through the Ages”:
Alex Wright
2007; Joseph Henry Press
ISBN-10: 0309102383


[1] cf. Bacon’s Novum Organum (1620)

Forgotten Forefather: Paul Otlet


One rainy afternoon in 1968, a young Australian graduate student named Boyd Rayward stepped into an abandoned office in the Parc Leopold in Brussels, Belgium. Inside, he discovered “a cluttered, musty, cobwebbed office into which the rain leaked—and one day flooded—causing the attendant then on hand to have a kind of epileptic seizure.”1 Piled high to the ceiling were dusty stacks of books, files and manuscripts: the intellectual flotsam of a seemingly disorganized old scholar.

The previous occupant, Paul Otlet, had been dead for nearly twenty-five years. A bibliographer, pacifist and entrepreneur, Otlet had in his heyday been feted as a great man, enjoying the company of Nobel laureates and even playing a role in the formation of the League of Nations. But by the time of his death in 1944, he had lived long enough to see his reputation fade to near-obscurity, seen his greatest ambition fail, and suffered the final humiliation of watching the Nazis cart away and destroy much of his life’s work. When he finally died a few months before the end of the war, hardly anyone noticed.

Who was Paul Otlet? Meet the forgotten forefather of information architecture.

The web that wasn’t

In 1934, years before Vannevar Bush dreamed of the memex, decades before Ted Nelson coined the term “hypertext,” Paul Otlet envisioned a new kind of scholar’s workstation: a moving desk shaped like a wheel, powered by a network of hinged spokes beneath a series of moving surfaces. The machine would let users search, read and write their way through a vast mechanical database stored on millions of 3×5 index cards.2

This new research environment would do more than just let users retrieve documents; it would also let them annotate the relationships among the documents themselves, “the connections each [document] has with all other [documents], forming from them what might be called the Universal Book.”3

Otlet imagined a day when users would access the database from great distances by means of an “electric telescope” connected through a telephone line, retrieving a facsimile image to be projected remotely on a flat screen.

In Otlet’s time, this notion of networked documents was still so novel that no one had a word to describe these relationships, until he invented one: “links.”

Otlet envisioned the whole endeavor as a great “réseau”—web—of human knowledge.

The Universal Decimal Classification

Although generations of philosophers had tried to solve the problem of classifying human knowledge—including Bacon, Wilkins, and Linnaeus—it was not until the middle of the 19th century that the problem came to a practical head. The industrialization of the printing business, coupled with the advent of cheap binding materials, spurred an explosion in publishing no less disruptive than the advent of Gutenberg’s press 400 years earlier.

Faced with an onslaught of new texts, nineteenth century scholars and bibliographers began to wrestle again with the problem of classification. Catalogers like Panizzi, Dewey and Ranganathan all devised elaborate subject schemes, laying the foundations of modern library and information science.

In 1895, Otlet and Henri La Fontaine established the Repertoire Bibliographique Universel (RBU), an ambitious attempt at developing a master bibliography of the world’s accumulated knowledge. Otlet recognized from the beginning that the success of the whole undertaking would depend largely on the usefulness of its conceptual software, the classification system.

After evaluating the classification systems then in use, such as Dewey Decimal and the British Museum system, Otlet concluded that they all shared a fatal flaw: they were designed to guide readers as far as the individual book—but no further. Ranganathan had voiced the ethos of modern cataloging when he said: “every reader his or her book, and every book its reader.” But once book and reader were matched, they were left pretty well to their own devices.

Otlet wanted to go a step further. He wanted to penetrate the boundaries of the books themselves, to unearth the “substance, sources and conclusions” inside.

Taking the Dewey Decimal system as his starting point, Otlet began developing what came to be known as the Universal Decimal Classification, now widely recognized as the first—and one of the few—full implementations of a faceted classification system.

While Ranganathan rightly receives credit as the philosophical forebear of facets, Otlet was the first to put them to practical use.

Facets of the Universal Decimal Classification

Facts: Empirical observations or assertions.
Interpretation: Analysis or conclusions, derived from “facts.”
Statistics: Measured, quantifiable data.
Sources: Citations or references.
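The four facets above can be sketched as fields on an RBU-style index card. The following is a minimal, illustrative Python model; the class and field names are this example’s assumptions, not Otlet’s own terminology:

```python
from dataclasses import dataclass, field

# A toy model of an RBU-style index card: one nugget of information,
# tagged with a UDC subject code, one of the four facets, and the
# "links" Otlet envisioned between related cards.
@dataclass
class IndexCard:
    udc_code: str            # subject notation, e.g. "004.891"
    facet: str               # "fact", "interpretation", "statistic", or "source"
    text: str                # the card's content
    links: list = field(default_factory=list)   # codes of related cards

card = IndexCard(
    udc_code="004.891",
    facet="interpretation",
    text="Expert systems encode specialist judgment as rules.",
)
card.links.append("004.8")   # connect the card back to a broader subject
```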

Today, the UDC comprises over 62,000 individual classifications, translated into over 30 languages (one reason for its popularity outside the U.S.). The UDC’s current top-level classes include:

0 Generalities. Science, knowledge, organization, computer science
1 Philosophy. Psychology
2 Religion. Theology
3 Social sciences. Law
4 [Under development]
5 Mathematics and natural sciences
6 Applied sciences. Medicine. Technology
7 The arts. Recreation. Entertainment. Sport
8 Language. Linguistics. Literature
9 Geography. Biography. History

So, for example,

004 Computer science
004.8 Artificial intelligence
004.89 Artificial intelligence application systems
004.891 Expert systems
004.891.2 Consultation expert systems4

In addition to the so-called Main Tables of subject headings, UDC also supports a series of Auxiliary Tables allowing for the addition of facets. These tables provide notations for place, language, physical characteristics, and for marking relationships between topics using a set of “connector” signs such as “+,” “/” and “:”.
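The connector signs can be pictured as simple string-composition rules. The sketch below is a deliberately simplified model of UDC syntax (real UDC notation has many more rules), and the codes in the example are illustrative:

```python
# Simplified sketch of UDC's "connector" signs as string composition.
# Real UDC syntax is far richer; this only illustrates the idea.

def relate(a: str, b: str) -> str:
    """The ':' sign expresses a relation between two subjects."""
    return f"{a}:{b}"

def add(a: str, b: str) -> str:
    """The '+' sign bundles two coordinate subjects."""
    return f"{a}+{b}"

def in_place(subject: str, place_code: str) -> str:
    """A common auxiliary of place, written in parentheses."""
    return f"{subject}({place_code})"

# "Statistics in relation to agriculture, in Belgium" (illustrative codes):
notation = in_place(relate("311", "63"), "493")   # -> "311:63(493)"
```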

The UDC’s capacity for mapping relationships between ideas—for constructing the “social space” of a document—provides a dimension of use not supported in other purely topical classification schemes. As the Universal Decimal Classification Consortium puts it:

UDC’s most innovative and influential feature is its ability to express not just simple subjects but relations between subjects … In UDC, the universe of information (all recorded knowledge) is treated as a coherent system, built of related parts, in contrast to a specialised classification, in which related subjects are treated as subsidiary even though in their own right they may be of major importance.5

The Mundaneum

In 1910, in the wake of the Brussels world’s fair, Otlet and La Fontaine created an installation at the Palais du Cinquantenaire that they called the Palais Mondial, later known as the Mundaneum.

Originally envisioned as the centerpiece of a new “city of the intellect,” the Mundaneum was to be the hub of a utopian city that housed a society of the world’s nations.

In 1919, shortly after the end of World War I, Otlet successfully lobbied King Albert and the Belgian government to furnish a new home for the Mundaneum, taking over 150 rooms in Brussels’ Cinquantenaire. At the time, not coincidentally, Belgium was lobbying to host the nascent League of Nations’s new headquarters. Hoping to help his country take center stage in wooing the new organization, Otlet pitched his project as the centerpiece of a new “world city.” Inside the new Mundaneum, he began to assemble his vast “documentary edifice,” eventually comprising over 12 million individual index cards and documents.

At the time, the 3×5 index card represented the latest advance in information storage technology: a standardized, easily manipulated vessel for housing individual nuggets of data. So, Otlet’s réseau began taking shape in the form of an enormous collection of index cards, filed away in a sprawling array of cabinets.

The effort met with early success, even attracting a healthy business in mail-order research services, in which users would submit search queries for a fee (27 francs per 1000 cards retrieved). The service attracted over 1500 requests a year on subjects from boomerangs to Bulgarian finance.6

But by 1924, the Belgian government had lost patience with the project, especially after the newly formed League of Nations chose Geneva over Brussels as its headquarters, robbing the Mundaneum of one of its primary raisons d’être. Otlet had to relinquish his original location, moving the Mundaneum to a succession of smaller quarters, even landing briefly, ignominiously, in a parking garage. After a series of fiscal struggles and management missteps, Otlet faced the difficult but unavoidable choice of shutting down operations in 1934. A few years later, Nazi troops carted away the remnants—to make way for an exhibition of Third Reich art.

After Otlet’s death in 1944, what survived of the original Mundaneum was left to languish in an old anatomy building of the Free University in the Parc Leopold, all but forgotten. Over the ensuing half-century, more than 70 tons’ worth of its original contents were destroyed. Finally, in the mid-1990s, a group of volunteers began resurrecting Otlet’s original vision, hoping to preserve and refurbish the original Mundaneum.

In 1996, the new Mundaneum opened in Mons, Belgium. While today’s Mundaneum serves primarily as a museum and learning center, rather than as a working incarnation of Otlet’s original plan, the new institution does an admirable job of perpetuating his legacy, and of reminding us of Otlet’s premonitory vision of a worldwide networked information environment.

The Traité


In a bitter irony, the Mundaneum’s 1934 closure coincided almost exactly with the publication of Otlet’s masterwork, the Traité de documentation, a manifesto crystallizing 40 years’ worth of writing and research into the possibilities of networked information structures.

Otlet biographer Boyd Rayward describes the Traité as “perhaps the first systematic, modern discussion of general problems of organising information.”7

With the faceted philosophy of the UDC as backdrop, the Traité posited a universal “law of organization” declaring that no document could be properly understood by itself, but that its meaning becomes clarified through its influence on other documents, and vice versa. “[A]ll bibliological creation,” he said, “no matter how original and how powerful, implies redistribution, combination and new amalgamations.”8

While that sentiment may sound postmodernist in spirit, Otlet was no semiotician; rather, he simply believed that documents could best be understood as three-dimensional,9 with the third dimension being their social context: their relationship to place, time, language, other readers, writers and topics. Otlet believed in the possibility of empirical truth, or what he called “facticity”—a property that emerged over time, through the ongoing collaboration between readers and writers. In Otlet’s world, each user would leave an imprint, a trail, which would then become part of the explicit history of each document.

Vannevar Bush and Ted Nelson would later voice strikingly similar ideas about associative “trails” between documents. What distinguishes Otlet’s vision from the Bush-Nelson (and Berners-Lee) model is his conviction—long since fallen out of favor—that a universal subject classification could work in concert with the mutable social forces of scholarship.

Otlet’s vision suggests an intellectual cosmos illuminated both by objective classification and by the direct influence of readers and writers: a system simultaneously ordered and self-organizing, and endlessly re-configurable by the individual reader or writer.

Does Otlet still matter?

Jorge Luis Borges’ fictional Library of Babel was a place containing “all the possible combinations of the twenty-odd orthographical symbols … the translation of every book in all languages, the interpolations of every book in all books.”10

For Borges, the universal library was a literary conceit, but for Otlet it was an achievable dream: an “edifice containing all the books and the information together with all the resources of space needed to record and manage them.”11

Otlet also recognized the practical importance of “search and retrieval performed by an appropriately qualified permanent staff.” Substitute the word “Google” for “permanent staff,” and Otlet’s vision starts sounding a lot like the World Wide Web.

While it would be an exaggeration to claim that Otlet exerted a direct influence on the later development of the Web, it would be no exaggeration to say that he anticipated many of the problems we find ourselves grappling with: the explosion of published information, the limitations of current delivery and storage mechanisms, the desperate need for a classificatory framework to help us store, manage and interpret humanity’s collective intellectual capital—and, perhaps, the limits of self-organizing systems.

In the Web’s current incarnation, individual “authors” (meaning both people and institutions) maintain fixed documents, over which they exert direct control. Each document is essentially a fait accompli, with its own self-determined set of relationships to other documents. It takes a meta-application like Google or Yahoo! to discover the broader relationships between documents (usually through some combination of syntax, semantics and reputation). But those relationships, however sophisticated the algorithm used to determine them, remain largely unexposed to the end user, never becoming an explicit part of the document’s history.
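To make that concrete, here is a toy sketch of the kind of relationship-mining such meta-applications perform: deriving a “reputation” score for each document purely from who links to whom, in the spirit of PageRank. This is a heavily simplified illustration, not any search engine’s actual algorithm:

```python
# Toy link-analysis sketch: a document's score is built up from the
# scores of the documents that link to it, iterated to a fixed point.
def reputation(links: dict[str, list[str]],
               damping: float = 0.85,
               iterations: int = 50) -> dict[str, float]:
    docs = list(links)
    rank = {d: 1.0 / len(docs) for d in docs}
    for _ in range(iterations):
        new = {d: (1 - damping) / len(docs) for d in docs}
        for d, outs in links.items():
            if outs:
                share = damping * rank[d] / len(outs)
                for target in outs:
                    if target in new:
                        new[target] += share
            else:
                # dangling document: redistribute its rank evenly
                for target in docs:
                    new[target] += damping * rank[d] / len(docs)
        rank = new
    return rank

# "a" and "b" link to each other; "c" links out but nothing links to it.
scores = reputation({"a": ["b"], "b": ["a"], "c": ["a", "b"]})
```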

Would Otlet’s Web have turned out any differently? We may yet find out. With the advent of the Semantic Web and related technologies like RDF/RSS, FOAF, and ontologies, we are moving towards an environment where social context is becoming just as important as topical content. Otlet’s vision holds out a tantalizing possibility: marrying the determinism of facets with the relativism of social networks.

In Otlet’s last book, Monde, he articulated a final vision of the great “réseau” that might as well serve as his last word:

Everything in the universe, and everything of man, would be registered at a distance as it was produced. In this way a moving image of the world will be established, a true mirror of his memory. From a distance, everyone will be able to read text, enlarged and limited to the desired subject, projected on an individual screen. In this way, everyone from his armchair will be able to contemplate creation, as a whole or in certain of its parts.12


  1. Rayward, “The Case of Paul Otlet, Pioneer of Information Science, Internationalist, Visionary: Reflections on Biography,” Journal of Librarianship and Information Science 23 (September 1991):135-145
  2. Rayward, “Visions of Xanadu: Paul Otlet (1868-1944) and Hypertext,” JASIS 45 (1994):235-250
  3. Otlet 1934 quoted in Rayward 1994
  4. Universal Decimal Classification Consortium, UDC flyer
  5. Universal Decimal Classification Consortium, “About the UDC.”
  6. Rayward, “Visions of Xanadu”
  7. Otlet quoted in Day, “Paul Otlet’s Book and the Writing of Social Space “
  8. Buckland, “Information Retrieval of More than Text” JASIS 42, 586-588
  9. Rayward, “Anticipating the Digital World: Paul Otlet and his Paper Internet”
  10. Borges, “The Library of Babel” in Labyrinths, p. 54
  11. Otlet, Traité de Documentation
  12. Otlet, Monde, pp. 390-391


Borges, Jorge Luis. “The Library of Babel,” in Labyrinths. New Directions, 1962. pp. 51-58.

Buckland, Michael. “Information Retrieval of More than Text.” Journal of the American Society for Information Science 42 (1991): 586-588.

Day, Ron. “Paul Otlet’s Book and the Writing of Social Space.” Journal of the American Society for Information Science, April 1997.

Otlet, Paul. Traite de documentation. Brussels: Editiones Mundaneum, 1934.
Otlet, Paul. Monde: essai d’universalisme: connaissance du monde, sentiment du monde, action organisée et plan du monde. Brussels: Editiones Mundaneum, 1935.
Rayward, W. Boyd. “The Case of Paul Otlet, Pioneer of Information Science, Internationalist, Visionary: Reflections on Biography,” in Journal of Librarianship and Information Science 23 (September 1991):135-145.

Rayward, W. Boyd. “Anticipating the Digital World: Paul Otlet and his Paper Internet,” Bartels Lecture at the University of Leeds, 2002.

Rayward, W. Boyd. “Visions of Xanadu: Paul Otlet (1868-1944) and Hypertext.” Journal of the American Society for Information Science 45 (1994): 235-250.


Universal Decimal Classification Consortium, UDC flyer.

Universal Decimal Classification Consortium, “About the UDC.”

Resources

The Mundaneum, Mons, Belgium

Michael Buckland’s Paul Otlet page

Universal Decimal Classification Consortium

Thanks

Thanks to Boyd Rayward, Francoise Levie and Stephanie Manfroid for their input and encouragement.

Images courtesy of the Mundaneum, centre d’archives, Mons, Belgium.

Alex Wright is a writer, information architect, and former librarian who lives and works in San Francisco.

The Sociobiology of Information Architecture

Pity the poor prokaryote.

Born blind, deaf, and mute, shuffling around in the darkness at 30 miles per hour, grasping for food, searching for mates, the life of your average bacterium (or any of the several trillion single-celled organisms on the planet) is invariably nasty, brutish, and short.

Be glad you’re a eukaryote. Like amoeba, insects, chimpanzees, and every other form of complex animal life, we enjoy not only the polymorphous pleasures of multi-cellularity, but also a singular gift, one that distinguishes us from all other known life forms: the ability to share knowledge with each other.

John Locke famously argued that “beasts abstract not.” But in recent years, breakthrough research by sociobiologists and evolutionary psychologists suggests otherwise: that not only do many of our fellow “beasts” abstract, but they have also developed surprisingly sophisticated and highly variegated mechanisms for managing information.

When most of us talk about “information architecture,” we seem to situate ourselves within strictly humanist reference points like graphic design or library science (with perhaps a perfunctory nod to journalism, cognitive psychology, or semiotics).

To approach information architecture from a purely anthropocentric perspective, however, is to overlook the lessons of billions of years’ worth of evolutionary history. We are by no means the first species to grapple with the basic problems of what we now call information architecture: how to acquire knowledge in social groups, how to get the right information to the right party at the right time, how to distill meaning from raw data.

Much as we may like to think of ourselves as belonging to a uniquely privileged species, the fact is that every complex organism on this planet is engaged in a shared struggle with information overload.

As information architects (and human beings) we should be careful of presuming that all our present quandaries surfaced only in the past few years—or, for that matter, in the last few thousand years of recorded human history (a comparative millisecond on the evolutionary clock).

Long before anyone was looking for “godfathers” of information architecture, our fellow species were wrestling with some of the same problems we face today. The real godfathers of information architecture, as it turns out, emerged a very long time ago with the earliest origins of life on this planet.

The memory “switch”

Let’s dial back in time to a hot wet Tuesday in, say, 2,200,000,000 B.C. Swimming in the briny planetary sea, we find the earliest recognizable living organisms: our aforementioned friends, the prokaryotes.

Now by this time, prokaryotes had been sloshing around in the ooze for something like a billion years. Then, about 2 billion years ago—whether by dint of divine impetus or happy cosmic accident—something remarkable happened: These formerly independent organisms started to collaborate.

To make a long story short: Networks of formerly independent bacteria began forming loose collectives—sharing labor, food, and increasingly deploying specialized cells to complete certain discrete tasks. Eventually, these loosely affiliated organic teams began replicating in tandem, taking on a more persistent form as they became the earliest multi-cellular organisms.

Eukaryotic cells took shape as “host” bacteria started allowing other bacteria to take up residence within them, gradually conscripting the simpler, formerly independent prokaryotes into service. Eventually, these new bacteria began to reproduce in tandem with the host bacteria, forming a replicable organism that evolved into successively more complex life forms—with increasingly specialized tasks.

These early eukaryotes—close cousins of present-day amoeba or slime molds—learned to sense and respond to environmental conditions, adapting cells and forming new cells in response to incoming data from the surrounding environment. One cell would capture a sensory impression and relay it through adjacent cells across its immediate network, tripping amino acid “switches” to signify changes in the environment.
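The relay just described can be sketched as a signal spreading hop by hop across a network of cells. This is a toy model only; real chemical signaling is vastly more complex, and the cell names are invented for illustration:

```python
from collections import deque

# One cell senses a stimulus; the signal propagates to its neighbors,
# flipping each cell's "switch" (a stand-in for the chemical state
# changes in real cells). A simple breadth-first traversal.
def propagate(neighbors: dict[str, list[str]], source: str) -> set[str]:
    switched = {source}
    queue = deque([source])
    while queue:
        cell = queue.popleft()
        for nxt in neighbors.get(cell, []):
            if nxt not in switched:
                switched.add(nxt)
                queue.append(nxt)
    return switched

cells = {"a": ["b", "c"], "b": ["d"], "c": [], "d": []}
propagate(cells, "a")  # every cell reachable from "a" ends up switched
```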

Maverick scientist Howard Bloom has theorized that the advent of multi-cellularity marked the birth of a nascent “global brain,” a worldwide neural network that would continue to grow and evolve for the next two billion years. “Informationally linked microorganisms possessed a skill exceeding the capacities of any supercomputer from Cray Research or Fujitsu,” writes Bloom. “Ancient bacteria” had mastered the art of worldwide information exchange.

Meet the original information architects.

Social networks 1.0

Let’s fast-forward another 1.5 billion years to a rainy Thursday in the late Precambrian. In one of those rare bursts of evolutionary activity (what Stephen Jay Gould famously called “punctuated equilibrium”), a sudden wave of life forms is taking shape during the so-called Cambrian Explosion.

It took a billion years for species to evolve to the point where complex multi-cellular organisms—like shellfish—could emerge. With increasingly elaborate networks of interdependent organs—mouths, hearts, legs, and so forth—individual organisms began to comprise a trillion cells or more.

As life forms became more complex, they also became less directly dependent on each other for survival. As a substitute for the direct transmission of data through chemical relays, these independent organisms began developing a new mechanism for transmitting knowledge: imitation. By observing, responding, and mimicking their peer organisms, these creatures could effectively leverage each other’s senses, experiences, and decision-making capabilities.

The archetypal success story of the Cambrian Era is the trilobite, a wildly prolific organism whose numbers at one point circled the entire planet (its survival as a species was aided in no small part by its predilection for wild, shells-off mating orgies). These organisms were self-contained, self-directed, and less dependent on a constant stream of data inputs for survival. Rather, they had evolved to the point where the individual organism had the resources to ensure its own immediate survival: sensing, responding, eating, and mating. But they were not exactly what you would call independent thinkers.

These new self-directed organisms still relied heavily on their peers for survival and adaptation.

These early social learning networks relied not on biological or chemical signals, but rather on imitative learning and the gradual development of behavioral “memes” that could persist beyond the lifespan of any one organism.

Pulitzer Prize-winning entomologist E.O. Wilson coined the term “sociobiology” to describe the study of social behavior from an evolutionary perspective. Wilson’s landmark 1975 book, Sociobiology: The New Synthesis, brilliantly punctured the prevalent scientific view that animal behavior could be adequately explained through the traditional disciplinary filters of biology, chemistry, and genetic inheritance.

Wilson argued that the social learning mechanisms evident in other species required a new perspective, a “synthesis” of biology and the social sciences. As he put it, “learned modifications of behavior are not inherited; only the innate predispositions are inherited, and only these can evolve by natural selection.” Learned behavior, in other words, must be passed along by other means: social groups can transmit and preserve knowledge through non-biological channels, forming “learning machines” with remarkable capacities for collective memory, calculation, and distributed decision-making.

Regrettably, Wilson’s work has been misinterpreted by certain doctrinaire humanists, who have chosen to infer parallels between sociobiology and pseudo-sciences such as genetic determinism or—worse yet—eugenics (the discredited pseudo-science the Nazis pressed into the service of genocide). Alas, like Darwin’s, Wilson’s theories have lent themselves to misuse and misappropriation by groups with political or social agendas. Some feminists, for example, have objected strenuously to the entire discipline of sociobiology on the grounds that it seems to offer an apologia for male dominance. Wilson himself has vigorously protested such abuse, frequently cautioning against perversions of science in the service of political “advocacy.”

Thanks to the conceptual foundation of sociobiology and, more recently, evolutionary psychology, we are beginning to understand the complexity and sophistication of nature’s super-organisms, some of them seeming to exhibit properties once thought the exclusive province of humanity: language, reason, even the outlines of culture.

For a glimpse of how these early “learning machines” may have operated, we need look no further than some of our contemporary planetary species.

Hive minds

Perhaps the most widely studied examples of nature’s collective learning machines are insect colonies. Ants and bees in particular exhibit remarkable powers of collective reasoning and an ability to accumulate and share sensory data in social groups.

Wilson devoted much of his career to studying the behaviors of ant colonies, which perform seemingly complex feats of calculation and geometry, and wage elaborately orchestrated group warfare.

Throughout the insect world, colonies of individual organisms appear to exhibit powers of reasoning not predicted by the capabilities of any single organism. Douglas Hofstadter memorably applied the concept of “emergence” to the behavior of ant colonies in “Ant Fugue,” a dialogue from his landmark Gödel, Escher, Bach, in which he described the dual nature of individual ants as both functioning organisms and, in effect, signals.

In his 2001 book Emergence, Steven Johnson explored emergence theory as a context for explaining the self-organizing properties of internet communications, and as a construct for self-directed software agents in a future, more intelligent incarnation of the World Wide Web.

While a few software developers have attempted intriguing experiments at modeling software after the distributed behaviors of ant colonies, we should bear in mind that the essential mechanisms of colony behavior cannot be explained solely in terms of mechanistic or mathematical models. Wilson argued that insect colony social behavior must be properly understood as “an idiosyncratic adaptation” to the surrounding environment, rather than a purely mechanical operation.

In other words, these behaviors constitute distinctly social responses, transmitted across generations through an elaborate dance of imitative learning and adaptation. There is another force at work here: information.

Monkeys in the mirror

Ever since Carl Linnaeus boldly decided to group humans with monkeys and apes into a family he designated “primates,” we have looked to these close evolutionary cousins for clues to our own behavior patterns.

Although we may tend to think of “culture” as a uniquely human trait, numerous primate studies have revealed the presence of localized social traditions, rudimentary language, and the facility for transmitting learned knowledge across generations.

Dutch primatologist Frans de Waal recounts an experiment in which he introduced a group of rhesus monkeys—a particularly argumentative, pugnacious species—to a troop of more even-tempered stump-tailed monkeys. Within a few months, the rhesus monkeys “developed peacemaking skills on a par with those of their more tolerant counterparts” through imitative learning and ritual displays. Most importantly, the rhesus monkeys carried on these behaviors long after they had been permanently separated from the stumptails. In other words, social transmission of knowledge effected a permanent change in group behavior.

All primate cultures seem to rely on learned—rather than genetically determined—social arrangements, which often vary between different social groups within the same species (as demonstrated in numerous chimpanzee studies).

While these social knowledge transmissions have no external symbolic manifestation—other primates don’t write books or create external symbolic language—they nonetheless create, store, and transmit knowledge that persists across generations: surely a manifestation of the same impulse that drives us towards information architecture.

The disintegration of hierarchy

Throughout most of human history, information has flowed through small groups in ways not so different from the imitative social learning mechanisms evident in other primates.

Only in the past five thousand-odd years of recorded human history have we developed the capacity for symbolic representation—and with it, a new externalized construct of “information.”

The rise of written language paralleled (and facilitated) the rise of the modern institution: churches, governments, universities, and corporations, to name a few. As these larger collective entities began to supplant the close-knit family and kinship bonds of earlier social groups, the institution also took on a new function as a container for shared knowledge—what Francis Fukuyama has called the “knowledge bureau.”

Fukuyama has argued that the rise of the “knowledge worker” in Western society, coupled with the liberating effects of communications technologies, is gradually undermining the institutional hierarchies that have characterized our collective social experience for millennia. And with the fragmentation of institutions comes the upending of traditional knowledge bureaus.

In The Social Life of Information, John Seely Brown and Paul Duguid draw the distinction between “fixed” sources of information that are typically the province of institutions (such as government records, books, and other documents) and the “fluid” information that tends to emanate from individuals and small groups (such as email, instant messages, and threaded discussions).

Howard Rheingold has recently chronicled the rising tide of fluidity in newly evident social phenomena like “smart mobs” and “flocking”—social behaviors in which large groups of individuals act in seeming concert, without any apparent organizational hierarchy at work. From political riots in the Philippines to mass events like the Miss World pageant riots in Nigeria to the unprecedented wave of coordinated anti-war protests, we seem to be in the early stages of a dramatic transformation in the behavior of social groups.

If we look closely at the dynamics of these new behavior patterns—widely dispersed, non-hierarchical social relay systems—we can easily recognize the contours of earlier patterns of communication and knowledge-sharing evident in every species, back to the earliest forms of life on this earth. While these recent phenomena may seem to spring from modern technologies, they manifest some very old patterns of social learning and knowledge sharing.

From social networks to social capital

Today, the practice of information architecture remains primarily an institutional endeavor, driven by the needs of corporations, governments, and educational institutions.

Today’s information architects are the heirs of yesterday’s scribes, clerks, and clerics: laboring to acquire, store, and disseminate knowledge for the sake of humanity, but ultimately in the service of institutions.

Now, some IAs may protest that assertion, arguing that the practice of IA is not about the organization, but about “the user.” But I would argue that if we look closely at that elusive user, we may discover not real human need but a flimsy straw man, a construct designed to serve an intrinsically institutional agenda.

What evolution teaches us is this: in order to understand the deeper roots of our need to generate and manage information, we need to look beyond the individual organism, towards the social groups that drive the mechanisms of evolution and adaptation for all species.

In recent years, the term “social software” has gained currency as a rubric for describing a new breed of software: groupware, social network visualization, discussion lists, and a host of other collaborative tools that support the needs of small, self-selected groups of individuals rather than organizational imperatives.

The real promise of social software has less to do with commercial productivity, and more to do with generating social capital: trust, social engagement, and the development of sustainable knowledge-sharing mechanisms that enable our advancement and evolution within social groups.

What does this mean for information architects? Over time, I believe we may find ourselves progressively less focused on solving the problems of institutions, and increasingly attuned to the needs of groups: a new kind of information architecture—and a very old one too.

Brown, John Seely and Duguid, Paul. The Social Life of Information. Harvard Business School Press, February 2000.

Johnson, Steven. Emergence: The Connected Lives of Ants, Brains, Cities, and Software. Scribner, September 2001.

Wilson, Edward O. Sociobiology: The New Synthesis. Harvard University Press, September 1975.

Alex Wright is a writer, information architect, and former librarian who lives and works in San Francisco. He maintains a personal web site.