Considering how much attention we lavish on the technologies of writing—scroll, codex, print, screen—it’s striking how little we pay to the technologies for digesting and regurgitating it. One way or another, there’s no sector of the modern world that isn’t saturated with note-taking—the bureaucracy, the liberal professions, the sciences, the modern firm, and especially the academy, whose residents, transient and permanent, have more right than anyone else to claim that taking notes is what we do.
That was more than sufficient justification for a conference called Take Note, held in November at the Radcliffe Institute for Advanced Study. But it’s an indication of the place of notes in the hierarchy of written genres that a one-day conference on the subject, with the participation of historians, literary scholars, and technologists, could attract the attention of media like The Boston Globe and The New York Times. The Atlantic headed its article “Duly Noted: The Past, Present, and Future of Note-Taking.” With notes getting as little respect as they do, that description of the conference’s ambitious compass conveyed none of the megalomania one would have registered if instead of “note-taking” it had said “writing books.”
There are notes and notes, of course: notes to oneself and notes to others; notes taken, made, jotted, and passed. Mash, doctor’s, suicide, and condolence notes. Field, class, and case notes; notes for general circulation; foot and head notes, notes of hand. But it’s the bookish notes that academics care most about, the ones that intervene between the things we read and the things we write.
I’ve never been very good at those, and I feel the failure keenly. I can’t read a long biography or history without reflecting on the bureaucratic efficiency with which the author must have collected and assembled the notes for it—“as strictly as the authorities keep their register of aliens,” as Walter Benjamin described the process—and how that requirement put those genres forever beyond my reach.
So it was probably rash to agree to take on the job of conference rapporteur, taking notes on talks about note-taking with the goal of assembling a plausible-sounding account of the presentations when the day was done. I figured it would keep me attentive, even jet-lagged as I was, but who was I kidding? As always, I wound up calling on the compensatory knack that we woolgatherers were forced to acquire in our undergraduate years: investing disconnected aperçus with an illusory narrative coherence. (We were rewarded later on when the humanistic disciplines took an anecdotal turn that made it hard to tell where isolated notes end and something else begins.)
But I had never much thought about notes as a topic of intellectual interest—how many people do?—much less as the occasion for a major conference, other than among the educational psychologists who are responsible for all of the first 50 hits that come up when you do a search on “note taking” in Google Scholar. Notes are tentative (“rough”) and instrumental—“the tracks we leave in the sand as we head on our trail of discovery,” as the historian and Radcliffe Institute dean Lizabeth Cohen put it in her opening remarks at the conference.
We don’t think of notes as ends in themselves: For all its brevity, a tweet doesn’t count as a note, but as a self-contained and “finished” publication. And when the word “notes” works its way into the title of a book or essay, it’s meant to convey a provisional effort. Hence the formulaic incantation “Notes Toward” (“… the Definition of Culture,” “… a Theory of Hierarchy”) with which writers try to inoculate themselves against the charge of grandiosity.
Things were different in the 16th and 17th centuries. True, there have been notes for as long as there has been writing, as the historian Peter Burke observed. (I might take it back even further, to the notched tally sticks on which Neolithics ticked off the days and phases of the moon.) But the modern surge in note-taking, and in contemplating and theorizing about it, dates from the early centuries of print, as people tried to come to grips with an explosion of knowledge—“that horrible mass of books that keeps on growing,” in Leibniz’s words.
In her book Too Much to Know (Yale University Press, 2010), the Harvard historian Ann Blair, one of the organizers of the Radcliffe conference, enumerated the tactics the early moderns deployed. They morselized books into compilations like the commonplace book, the florilegium, the adversaria, and the zibaldone.1 They generated encyclopedic distillations and elaborate systems of indexing and classification. And they wrote numerous treatises on taking notes and devised ingenious contrivances for organizing them.
In his 1689 De arte excerpendi, the Hamburg rhetorician Vincent Placcius described a scrinium literatum, or literary cabinet, whose multiple doors held 3,000 hooks on which loose slips could be organized under various headings and transposed as necessary.2 Two of the cabinets were eventually built, one for Placcius’s own use and one acquired by Leibniz. It was an early manifestation of the principle that still governs our response to the knowledge explosion: The remedy for the problems created by information technology is more information technology.
There has always been a certain self-delusion in the academic’s conception of note-taking as a purely instrumental process, one that serves merely to get us from the things we read to the things we write. As the media scholar Lisa Gitelman recalled, one could invert the relation, as Walter Benjamin did in One-Way Street, itself a collection of notes. For him, the rise of note-taking signaled the book’s reduction into a purely transitional object, “an obsolete mediation between two different filing systems.” Everything that matters, he said, could be found in the card boxes of the researcher who wrote it, which the scholar studying it had merely to incorporate in his own card index.
Benjamin thought of the “rise” of note-taking literally. As he saw it, notecard boxes were just one of the forms, along with newspapers, advertisements, and placards, that were restoring writing to the perpendicular plane it had occupied in the epigraphic age, before it took to bed—temporarily, it turned out—in the horizontal pages of the printed book. The verticalizing card box, like Placcius’s cabinet, was a solution to a persistent problem: The surfaces that are most convenient for note-taking are often inconvenient to store, organize, or preserve.
The margin of a page is the handiest place to make a note, but it has to remain there unless it’s transferred to something more mobile, like a slip of paper (of one sort or another: Montesquieu took notes on playing cards, and Arthur Maling, one of the early editors of the Oxford English Dictionary, recorded word citations on chocolate wrappers). Or speaking of handy, as the Harvard literary scholar and conference co-organizer Leah Price pointed out, there’s the hand itself, a perennial recourse for student exam-takers, as well as for public figures like Sarah Palin, who referred to notes on her palm in delivering a 2010 speech to the National Tea Party Convention, in which she described President Obama as “a charismatic guy with a teleprompter,” her own low-tech method presumably having obviated the need for one.3
Slips and cards are handy and easy to sort and re-sort, but they’re also easy to misplace and can be scattered by an errant breeze. To overcome those problems, they could be sewn together, which was Pascal’s expedient, or glued to the pages of a book, though both those procedures fix them permanently in place. To store the millions of citation slips used in preparing the OED, James Murray installed a grid of 1,029 pigeonholes in the scriptorium he built in the garden of his house, in Oxford. Samuel Johnson used three methods in preparing his Dictionary, marking citations of words in the margins of books, to be transcribed by his amanuenses to slips, which in turn were glued to sheets of paper in preparing the individual entries.
The process would have been simplified had he had the benefit of the repositionable sticky note introduced by 3M, 30 or so years ago, under the inapt name Post-it. (We don’t talk about “posting a note” in a book, and those little yellow strips aren’t very suitable for posting a message on a telephone pole.) The Post-it ranks as one of modern chemistry’s two major contributions to the work of annotation, as partial reparation for the other, the highlighter pen, the colorist’s revenge on the printed page.
There’s no evidence that Leibniz made any use of his literary cabinet. Despite his lifelong interest in organizational schemes—he designed one of the earliest book-indexing systems—Leibniz’s note-taking was as disorganized as it was obsessive. Often, he said, he found it easier to repeat a piece of work than to locate his earlier notes amid “a chaos of jottings that I do not have the leisure to arrange and mark.” At his death, he left a Nachlass of several hundred thousand notes and loose papers, which is still only partially published. It’s my guess that he acquired Placcius’s cabinet out of the same yearning for order that drives me to acquire new organizing systems, as I contemplate the scattered piles of notes and papers that make my workspace look like downtown Pompeii. Though for me, in the digital age, order seems even more tantalizingly just out of reach.
It’s no coincidence that the appearance of new—and essentially vertical—display surfaces coincides with the re-emergence of notes as first-class epistemological citizens and the demotion of the book to a form of informational packaging whose work is discharged once its contents have been assimilated into someone’s system of notes and bookmarks. “There is fantastic information in books,” the Google co-founder Sergey Brin says in explaining the motivation for the company’s book-digitization project. “Often when I do a search, what is in a book is miles ahead of what I find on a Web site.” It’s in the nature of the search procedure called Googling, after all, that we address a book as a collection of propositions that we access by entering some words and barreling in sideways. (PowerPoint goes further still. Posting one’s lecture slides accomplishes the note transfer automatically, without requiring conversion to a different filing system, once the notes have been ceremonially verticalized and consecrated in the class session. Yet however detailed the slides are, students seem to feel compelled to take notes on them, as if they need something to do with their hands.)
Digital note-taking systems were a direct outgrowth of the early hypertext knowledge-representation systems. I had my first encounter with one of those when I arrived at the Xerox Palo Alto Research Center in the mid-1980s. In addition to their better-known innovations (the laser printer, the WYSIWYG text editor, the graphical user interface, the Ethernet), the center’s researchers developed the system NoteCards. It was a thing of wonder, back when the computer could still induce that feeling. You could create notecards containing text or graphics, sort them into file boxes, and link them according to whatever relationship you chose (“source,” “example,” etc.), while navigating the whole network via an overview in a browser window. It was as close as you could come to a digital implementation of Placcius’s cabinet, freed from the material constraints of slips, hooks, and drawers and from the requirement that each slip fill only one slot in a network.
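The arrangement can be suggested in miniature. What follows is a hypothetical sketch in Python, not the actual NoteCards software; the names Card and FileBox and the sample cards are my own inventions. The point it illustrates is the one above: cards carry text, file boxes group them, and typed links tie them into a network that a physical slip, hanging on its one hook, could never form.

```python
from collections import defaultdict

class Card:
    """A notecard holding text, with typed links to other cards."""
    def __init__(self, title, text=""):
        self.title = title
        self.text = text
        self.links = defaultdict(list)  # link type -> list of target cards

    def link(self, link_type, target):
        # Connect this card to another under a chosen relationship,
        # e.g. "source" or "example".
        self.links[link_type].append(target)

class FileBox:
    """A container for sorting related cards, like a drawer of slips."""
    def __init__(self, name):
        self.name = name
        self.cards = []

    def add(self, card):
        self.cards.append(card)

# Unlike a paper slip, one card can sit in many boxes and carry many links.
claim = Card("Books as filing systems", "Benjamin's remark in One-Way Street")
source = Card("One-Way Street", "Walter Benjamin, 1928")
claim.link("source", source)

box = FileBox("Note-taking history")
box.add(claim)
box.add(source)
```

Navigating the network, as the browser window did, is then just a matter of following links outward from any card.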
I set to work organizing a book project, with the idea that if I could lay out the ideas and structure so the whole was visible at once, the actual process of writing would resolve itself into two simpler components: the writing down and the writing up. But the system failed me—or rather, I failed it. There are plenty of books that are written in just that way, and you can tell right off; the commonplace book survives in the academic treatise whether or not one actually numbers the paragraphs. But I’ve always found that the atoms and relationships resolve themselves only in the act of trying to tell the story. What you learn, after a while, is that writing is 10 percent inspiration, 10 percent perspiration, and 80 percent transitions.
Still, I keep trying, assembling project notes in one medium after another—Evernote, Word’s notebook view, SOHO, stickies, Google docs, and those carnets with the little squares that I stock up on when I’m in France—to the point where my personal knowledge base, like Leibniz’s, is scattered across a farrago of incommensurable schemes—the way my books would look if I just kept adding new shelves without ever reorganizing the old ones. It’s a little disheartening, but not a cause for dejection as it must have been for Leibniz. If I need to find something, I can always run a search over the whole accumulation. We need never lose track of any thought in the age of search, only of its place in the order of things.4
One doesn’t have to go it alone, of course. Placcius envisioned that his literary cabinet would permit “social excerpts” by groups of scholars such as the newly formed academies and learned societies. And scholars ever since have looked for ways to make reading and annotation into collaborative processes—the developers of NoteCards promptly went on to develop systems for facilitating collective authorship. But the interest has accelerated in an age that has made it an article of faith that there’s nothing we do by ourselves that wouldn’t benefit from a little help from our friends.
A search on “social note-taking” turns up dozens of apps and systems, most designed to facilitate activities like project management, but some aimed at transforming reading itself into a collaborative activity. At the Radcliffe Institute conference, David Karger described NB, a “social document-annotation system” developed by the artificial-intelligence lab at the Massachusetts Institute of Technology that enables students to engage in discussions about the annotations they post in the margins of texts. The system has been successfully deployed at a number of universities, chiefly in classes in the physical sciences, where the larger the group, the more likely it is to converge on the correct answer—we have more confidence in the authoritativeness of Wikipedia’s articles on probability distributions and solar wind than in the ones on romanticism and street art. But in humanities classes, too, there are plenty of times when students have every incentive to throw themselves on the mercy of the wisdom of the many. The collective may not arrive at the definitive meaning of Critique of Pure Reason, but it’s apt to come closer to the one the professor has in mind.
Some people go further, arguing that reading itself should be thought of as essentially a collaborative activity. Bob Stein, founder of the Institute for the Future of the Book, suggested that “the idea that reading is something you do by yourself is very, very recent.”5 The institute has created its own platform for collaborative reading, and a few years ago it commissioned an experiment in which seven writers and critics collectively read and annotated Doris Lessing’s The Golden Notebook. In the future, Stein suggests, we’ll think of a book less as a physical object than as a “place to congregate,” in response to which David Weinberger, a researcher at Harvard’s Berkman Center for Internet and Society, tweeted provocatively that “private note-taking seems selfish to me. Make it all public.”
But our fellow readers have never been entirely absent from view. Think of the tens of thousands of used copies of Pride and Prejudice still in circulation with “It is a truth universally acknowledged …” highlighted or underlined and “IRONY” written in the margin, as readers affirm that they’re in on the game. In theory, it’s true, the e-book relegates the contributions of those other readers to the digital shadows. But if you miss them, really miss them, your Kindle will oblige you by flagging the sentences that others have highlighted. Or you can refer to Amazon’s list of the most highlighted passages ever, posted to “help readers focus on passages that are meaningful to the greatest number of people.”
The opening sentence of Pride and Prejudice comes in second place, just behind one from The Hunger Games, whose selections occupy 13 of the top 15 slots. As Ann Blair has said, the annotations made by ordinary readers have always been useful windows on shared patterns of thought, and those of us anxious about the fate of literary culture can take some comfort in knowing that Austen can still claim so high a place in Amazon’s collective florilegium. Still, you might wonder if it’s possible to grasp the irony of her sentence without undermining one’s faith in the wisdom of crowds.
The fact is that books have always been places to congregate, even when we’re alone in a room with them. To read a book is to enter into a kind of communion with an “imagined community,” in Benedict Anderson’s famous phrase—“imagined” being the key word. It isn’t just that you don’t need to know who your fellow readers are or what they’re thinking, but that that’s generally preferable. As those used books remind us, the sense of sodality with what Dr. Johnson called the “nation of readers” isn’t always deepened by seeing the comments our compatriots have obtruded in the margins.6 Things are different, of course, when the annotators aren’t our literary compatriots. Marginalia increase in value, figurative and literal, with the temporal distance between their creator and the later reader. Even when they’re banal, they’re banal in revealing ways. The past is a foreign country; they annotate things differently there.
1. In Shakespeare’s time, as the Oxford Elizabethanist Tiffany Stern recalls, people used portable and erasable “table books” to record their observations, lines from plays, and “saws of books,” as Hamlet calls them as he vows to obey the Ghost’s entreaty to “Remember me”: “My tables, my tables, meet it is I set it down.”
2. The idea of the cabinet had been proposed a half-century earlier by the British scholar Thomas Harrison. But Parliament denied him the money to build a prototype, thus making him the first in a line of distinguished hypertext pioneers, including Vannevar Bush and Ted Nelson, to whom it wasn’t given to see their visions implemented.
3. Some would add here the starched cuff that figures in the expression “speaking off the cuff.” As best I can tell, though, that cuff has always been purely proverbial; the expression has never meant anything other than “speak extempore.” Indeed, it didn’t catch on until the 1940s, when the detachable shirt cuff was already going out of fashion.
4. Most people who talk about “search” are referring specifically to varieties or successors of grep, the tool developed for the Unix operating system in 1973 that allows one to search for strings or patterns of characters. Not that grep obviates the need for organizing one’s notes, but it offers some comfort to those of us who are annotationally challenged, though it helps if we’re able to recollect that the passage we’re searching for contains “ranunculus.”
5. Well, not really—recall Hamlet’s irritation when his solitary reading is interrupted by Polonius, or Jane Eyre behind the curtain in the window seat with Bewick’s British Birds, or David Copperfield “sitting on my bed reading as if for life.” The “out-of-body raptness” induced by the solitary reading of what Leah Price calls the absorbent book was a persistent image long before the expression “curling up with a good book” appeared, in the 20th century (as it happens, in the same year as the introduction of the first La-Z-Boy). What’s really novel is the idea that with new technologies, even the activity of silent reading can be accomplished collectively, though without the advantage of having a speaker who can do the Police in different voices.
6. Nowadays we also encounter our fellow readers via the comments threads appended to Web-based texts. Those, too, can be disconcerting; you wonder how Austen’s readers might have received the first sentence of Pride and Prejudice if it had been posted online. (“Universally??? Ya think??”) I try to avoid reading the comments when something of mine is posted, bearing in mind the admonition of Denis Donoghue: “I keep my soul pure by attending to the theme and letting the millions of readers behind my shoulder do whatever they like. I am a proud spirit, and they for ever clay.” But I usually wind up peeking, get irritated—“Hello? Is that what you think I meant?”—and have to restrain myself from replying in the persona of a sock puppet. Some communities are best left to the imagination.
Author Bio: Geoffrey Nunberg, a linguist, teaches at the University of California at Berkeley’s School of Information. His most recent book is Ascent of the A-Word: Assholism, the First Sixty Years, published by PublicAffairs.