Digital [draft] [#digitalkeywords]

“Perhaps we should understand the digit as an openly imitable and probabilistically imperfect index of any thinkable world, including this world, with which there can be no final convergence.”

 
The following is a draft of an essay, eventually for publication as part of the Digital Keywords project (Ben Peters, ed). This and other drafts will be circulated on Culture Digitally, and we invite anyone to provide comment, criticism, or suggestion in the comment space below. We ask that you please do honor that it is being offered in draft form — both in your comments, which we hope will be constructive in tone, and in any use of the document: you may share the link to this essay as widely as you like, but please do not quote from this draft without the author’s permission. (TLG)

 

Digital — Benjamin Peters, The University of Tulsa

“Every digital device is really an analogical device.” [1] Norbert Wiener

“The days of the digital watch,” the playwright Tom Stoppard once joked, “are numbered.” The pun may prove prescient: the keyword digital, derived from the Latin digitalis, from digitus (“finger, toe”), has enjoyed a steady rise from almost nothing before the 1950s to one of the top 2,500 words in contemporary English, applied to everything from electronics (not only the digital watch, but also the camera, clock, computer, disc, video), to social descriptors (digital divides, natives, and revolutions), to emerging fields of inquiry (digital art, humanities, physics, and studies). Given all this, however, its heyday as a keyword may already have passed: the phrase “digital computer,” for example, is almost unheard of today precisely because such computers are so common, while its counterpart, the “analogue computer,” is now a marked historical oddity. Likewise “digital photography” and “digital television” are quickly becoming simply photography and television. In other words, the sweeping success of digital techniques may render the term a quintessentially twentieth-, not twenty-first-, century keyword.

Before it referred to electronic computing media, the term signaled a deep and broad range of discrete signification techniques.[2] The most ancient of these traces back to the origins of the keyword: the digit, or the index finger that can both count and point. The coin, the yad (“hand,” or Torah pointer), the manicule (the “pointing hand,” “index,” or “digit” in the margins of eleventh- to eighteenth-century typography), the piano keyboard, filing systems, the typewriter, and the electronic telegraph are all “digital” in the sense that humans interface with these media digitally, that is, with our fingers, via manual manipulation and push buttons. The touchscreens we pet and caress today are heirs to an ancient tradition of fingers digitally counting and pointing out literate lines.

Ever since we evolved the extensor indicis muscles connecting arms and fingers, ours has literally been what media theorist Till Heilmann calls a “digital condition”: digital media do what our fingers do. In particular, I will argue that digital media count the symbolic and index the real. In other words, as is common convention, they can compute, or discretely number and manipulate numbered objects; but digits do something else too: they can also point, or index and reference objects at a distance. In what follows, informed by information theory, semiotics, and language philosophy, I argue that our understanding of digital media as counting and computing discrete symbols precisely should be balanced by the fact that they also point to and index non-digital elements of reality approximately. Modern students and scholars foreclose against a fuller understanding of the limits of our digital condition when they seek to understand digits only computationally. By reviewing in two parts how digits function as both counters and pointers, or computers and indices, I hope to shed light on how the digital has come to both inhabit and imitate our modern world.

Counting the Symbolic: The Triumphs of Digital Computing

A recent publication in Science claimed that total computing power worldwide has enjoyed a staggering compound annual growth rate of 83% since 1986.[3] In fact, digital computing power has proliferated exponentially since at least 1946, when the mathematician John von Neumann showed at the first Macy Conference on cybernetics that all signals can be converted into digital format simply by introducing a discrete, symbolic threshold: at or above this level, call the signal one; below that level, call it zero.[4] For example, the meridian that the sun crosses overhead in the sky is the threshold between morning and afternoon. The psychoanalyst Lacan might say that analog-to-digital conversion seeks to suppress “the real” with “the symbolic,” and here I understand, with media theorist Friedrich Kittler, “the real” as those physical, continuous, material, and analog elements of our world that can be recorded by a video camera or phonograph, while “the symbolic” makes up all the artificial, discrete, logical, and digital elements that can be recorded by a typewriter.[5] For example, the typed time of the clock supplants the real time of the sun overhead.
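To make von Neumann's move concrete, here is a minimal Python sketch of threshold-based analog-to-digital conversion; it is my illustration rather than his procedure, and the sample readings and the 0.5 threshold are arbitrary assumptions.

```python
# A toy analog-to-digital converter: a real-valued signal becomes a
# stream of discrete symbols once a threshold is imposed on it.
def digitize(samples, threshold=0.5):
    """Call a sample 1 at or above the threshold, 0 below it."""
    return [1 if s >= threshold else 0 for s in samples]

# A continuous-looking "analog" signal, here just six arbitrary readings.
analog_signal = [0.12, 0.48, 0.51, 0.97, 0.30, 0.75]

print(digitize(analog_signal))  # -> [0, 0, 1, 1, 0, 1]
```

Everything continuous in the readings above is lost; everything at or past the threshold survives as a countable symbol.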

The point, first articulated by Leibniz and later formalized by the logicians Boole and Shannon, is simple: all real signals can be reduced, with a certain loss, to digital symbols. Anything one wants to describe, whether content (sensory experience of matter), space (coordinates), time (intervals), or instructions (programming, algorithms), can be expressed in the irreducibly countable alphabet of that one binary difference, 0 or 1. As the logician Alan Turing showed in 1937, even the most basic digital computer, given enough time and memory, can solve any computable problem.[6] Since then, “universal Turing machines,” or general-purpose digital computers, have led the initial spread of what the cyber scholar Jonathan Zittrain calls “generative” digital devices.[7] We live increasingly among interconnected devices that can compute anything countable.

The momentous logic of digital computing, taken to its extreme, leads to the increasingly prominent position that everything that is, is in fact countable. Information physicists, for example, contend that nature has always already been digital, or that the real is at base symbolic: magnets have north and south poles, electric charges are positive or negative, and quarks spin either up or down. We have long been made of digital quanta. In Kittler's phrase, only that which is switchable can be (“nur was schaltbar ist, ist überhaupt”), or, as the theoretical physicist John Wheeler put the worldview, “it from bit.”[8] In ASCII, as in all programming, there can be no mistake: each symbol is uniquely encoded. Cognitivism, too, seeks to distill the vagaries of memory, emotion, and experience into the biomechanics of synaptic firings across neurological circuits. It is as if in the beginning was the bit, and the computing of bits, from stone coins to bitcoin, has since overflowed modern life. Hillel Schwartz, in his monumental The Culture of the Copy, has claimed that the defining characteristic of modernity (one fully embodied in the digital age) is its preoccupation with exact copying and its discontents.[9] This is the first feature of the long legacy of the digital: metadata aside, digits are copied with uncanny exactness.
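That exactness can be shown in a short Python sketch of my own (not Schwartz's example): an ASCII string, once encoded into bytes, can be duplicated and verified bit for bit, a fidelity no analog inscription can claim.

```python
import hashlib

# Each ASCII symbol has one and only one binary encoding...
original = "in the beginning was the bit"
encoded = original.encode("ascii")   # a unique byte sequence
copy = bytes(encoded)                # a bit-for-bit duplicate

# ...so a copy can be verified as exact, down to the last bit.
print(copy == encoded)                          # True
print(hashlib.sha256(copy).hexdigest()
      == hashlib.sha256(encoded).hexdigest())   # True
```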

The more digital media spread, the more exacting and all-encompassing our counting regimes become. As the critic Walter Benjamin pointed out in 1936, the mechanical reproducibility of content brings with it a new aesthetics of imitable art, such as in contemporary remix, DIY, pastiche, and bricoleur cultures online and off.[10] The rise of big data analysis, too, depends on computing power scaling trivially from the sample set to the full population of data, from scanning a brochure with our eyes to scanning the Library of Congress with algorithms. Democracy enthusiasts extol the virtues of online voting and debate, where all voices, we muse hopefully, might count equally. Chess enthusiasts hunger after (and fear) a complete book of moves online. The temptation of the search bar prompts us to imagine that Borges' all-containing catalog is nearly within reach.

Yet the triumph of digital computing is not an unmitigated good for all, especially not for the disenfranchised many: in fact, the larger the franchise, the more computing power serves its interests.[11] This digital Matthew effect, where the digital get more digital, is the dream at once of the information theorist, the universal strategist, the advertising executive, and the utopian futurist who proclaims the coming digital “singularity,” a term first used by Stanislaw Ulam in 1958 in reference to von Neumann's suggestion of a technologically driven paradigm shift in the history of the human race.[12] Since the most fundamental building block of all that we know and are is already the bit of information, these computation enthusiasts contend, the broader the spread of digital media, the more powerfully humans will be able to represent and reshape reality itself. Digital techniques will continue to render more of the computable world visible to those increasingly large organizations in control of computing techniques. The consequence for those occupying the commanding heights of computation today appears nothing short of a potential total digital convergence, a universe wherein all bits are known and in play at once.

Indexing the Real: How Digits Point Elsewhere

Digits, however, do more than just compute. Like fingers, they also point. And, as anyone who has been burned by a misplaced finger knows, pointing is far from an exact science. Just as the internal systems that digital media compute are finite, rational, and discrete, so the external world to which the same media point remains infinite, irrational, and approximate, and it is this difference, I argue, that firmly insures against both the promise and the threat of total digital convergence. In what follows I explore the tenuous work of digital media that point and refer to real-world objects outside of themselves; I will argue that this transducing from the symbolic to the real limits both the computing and the indexing power of digital media.

Another name for digits that point is indices (the plural of index). Charles Sanders Peirce, a founding pragmatist and semiotician, divided the world into three types of signs (unlike the signifier-signified binary of Saussure behind the postmodern turn): the icon, which, like a portrait, resembles the thing it points to; the symbol, which, like the word couch, means a place to sit only because convention has taught us to recognize it as such; and the index, which has a natural connection to the thing it points to but is not that thing, just as a symptom points to a disease while not being the disease itself.[13] The question of real-digital world convergence, as we will see, hinges more on how well digital media index than on how well they compute.

To be an index, then, is to render approximately, or refer to, something outside of its own signifying system, and thereby to claim some non-necessary but useful connection to that thing. Non-digital indices abound: a book index points the reader from outside the body of a text to the right page in the body of the text, but not to the exact phrase. The weather vane is not the wind, but it indexes that complex vector field into a single well-defined direction. Smoke too indexes fire: smoke is not fire, but it signals fire by saying, roughly, “follow me to find an ongoing combustion process.” For philosophers of language from Wittgenstein to Austin, this point is basic: all meaningful relationships begin by creating a semiotic structure that excludes something else (Gödel's theorems make a sibling point: no computational system can be both complete and consistent on its own terms).[14] In fact, the founding information theorist Claude Shannon began by excluding meaning itself from computational communication.

Digital media do the same. Our favorite social networking sites reacquaint us with friendly personas and profiles that point to, but are not, the friends we know in person. Google Maps gives a godlike view of the land surface we both know and do not know by presenting a reducible, scaling approximation of it. (To represent reality exactly, a map would cost computationally at least as much as the reality it indexes.[15]) Digitally programmed artificial intelligence, robots, prostheses, 3D printers, and animation, too, are useful and discomforting only so long as they imitate natural objects inexactly; the original digit, the index finger, is useful exactly because it is not the object it refers to.

In his landmark 1948 article that ushered in the computational approach to communication championed in the first half of this essay, Shannon noted this same constraint: no act of computing can claim to understand how its messages relate to the real world. For him, computing and indexing are functions as distinct as fingers that count and fingers that point. He describes the indexing function thus: “frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities.” Then he adds, “these semantic aspects of communication are irrelevant to the engineering problem.”[16] He is not saying that digital media do not shape our world; rather, he is saying that a computational understanding of digits can never speak to such matters. In other words, Shannon, a pioneer of the digital age, argues that the question of digital convergence, or the binding together of the symbolic and the real, is in fact wholly irrelevant to the computational nature of the digit. To understand such questions, we need a new idea of what the digital does, such as the index.

Note that Shannon's black-boxing of meaning does not mean that the ways that computers index, or that counters point, to real life are meaningless or uncorrelatable. In fact, the very mathematical engine (probability and statistical mechanics) driving much modern-day computational convergence, including Shannon's technocratic insights, rests on a surprisingly indexical, not only symbolic, relationship to reality. Probabilities do not just count what is; they point ahead to what could be, with a certain uncertainty. To say that there is a 42% chance of rain tomorrow sounds mundane, but it actually exercises extraordinary license to infer from past data about multiple distinguishable futures: given a hundred future tomorrows, 42 will experience rain. All digital messages, all data, must be treated, according to Shannon, as if they were but “one selected from a set of possible messages.”[17] Even though many messages we send and receive likely have meaning, the vast majority of mathematically possible messages are pure spam (Borges' Library of Babel again makes this point). The indexical digit brings us back to a familiar point: a finger, like digital media, can point to anything, which means that most of what we point to most of the time (even to the future) is at worst probably meaningless and at best probabilistically meaningful.
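A back-of-the-envelope Python sketch may make that probabilistic pointing concrete; the 42% figure is this essay's illustrative number, and the simulation and entropy calculation below are my own hedged additions, not Shannon's worked example.

```python
import math
import random

# "A 42% chance of rain tomorrow": a digit that points ahead to a set of
# distinguishable futures rather than counting a present fact.
p_rain = 0.42
futures = [random.random() < p_rain for _ in range(100)]
print(sum(futures), "of 100 simulated tomorrows experience rain")  # roughly 42

# Shannon treats every message as "one selected from a set of possible
# messages"; the information carried by that selection is measured in bits.
def entropy_bits(p):
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(round(entropy_bits(p_rain), 3), "bits per rain/no-rain forecast")  # ~0.981
```

The forecast, in other words, counts nothing that already exists; it indexes a range of possible worlds with measurable, never perfect, confidence.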

Digital media index not only our world but all possible worlds, even the parts of our world we would prefer not to see. This fact sobers digital convergence hype and at once showcases the true negatives and false positives behind genuinely digital social problems. Anyone tempted to believe in the coming digital singularity need only observe how rarely online avatars and dating profiles resemble their users. Symptoms of you and me lurk online. Or consider how Google Maps, a modest indexing compared to the digital ears and eyes of surveillance states, not only represents the relevant roadsides we recognize; it also indexes the homeless, those accused without trial, and all others whose privacy our law, technology, and society do not defend.[18] By indexing all that we send, receive, and process into distant databases, omnivorous data harvesting and coal-powered cloud computing techniques force users to exist in a world that can only be “saved,” and never deleted, with the click of a finger. Digital databases index the real with eerie accuracy (a recent study found, for example, that the metadata alone collected in NSA phone tapping have enough inferential power to invade personal privacy) as well as with the risk of condemning errors (purchasing a Union Jack and certain soil fertilizers is enough to automatically file a customer on a terrorist watch list).[19] Critically understood, digital indexing techniques let the privileged symbolically construct worlds veiled from the same real world that those same techniques transform in the service of the large and unscrupulous.

In short, once we can see digital media as both symbolic computers and indices of the real, we must recognize, following Shannon, that digital computation has made the utopian dream of convergence a priori impossible and that, at the same time, huge problems abound in and because of the accelerating computational correlations between the real and the symbolic, the analog and the digital.[20]

Summary

Digital media have been both counting the symbolic and pointing out the real since humanoids have had fingers, even though the explosion of computing power has swept up the digital to such a degree that the techniques may now be outrunning the term. Its days as a keyword, as Stoppard jested, are numbered. To understand our digital age, we must understand not only the numbers, that digits count, compute, construct, and copy internally discrete symbolic worlds, but also that digital media can point to or index all possible worlds, not only our real one. This second point helps counterbalance, sober, and caution Whiggish enthusiasm for the ongoing digital revolution eventually leading to total data convergence. It is as if digital computing, without indexing, does work similar to counting 1+1=2 on our fingers, while, with it, our digital media formulas don real-world units that apply with probabilistic, and never precise, degrees to all possible worlds around us.[21] Perhaps we should understand the digit as an openly imitable and probabilistically imperfect index of any thinkable world, including this world, with which there can be no final convergence. The last seventy years have ushered into existence a host of digital devices that now populate our pockets, warehouses, and working models of the world. The lot of these reality doppelgangers, like that of all digital-indexical media before them, is to point to endless and imprecise imitations of their makers.


Notes

1. Norbert Wiener, quoted in Claus Pias.

2. Cf. Bernhard Siegert, Passage des Digitalen: Zeichenpraktiken der neuzeitlichen Wissenschaften 1500-1900.

3. Martin Hilbert and Priscila López, “The World's Technological Capacity to Store, Communicate, and Compute Information,” Science 332, no. 6025 (2011): 60-65.

4. Von Neumann, Macy Conference transcripts; see also Heims, The Cybernetics Group.

5. Claus Pias, “Analog, Digital, and the Cybernetic Illusion,” accessed 1 May 2014 at https://www.uni-due.de/~bj0063/texte/kybernetes.pdf; cf. Friedrich Kittler, Gramophone, Film, Typewriter.

6. Turing, “On Computable Numbers, with an Application to the Entscheidungsproblem.”

7. Jonathan Zittrain, The Future of the Internet (Yale University Press, 2008).

8. Tom McCarthy, “Kittler and the Sirens,” London Review of Books (find original quote); John A. Wheeler, “Information, Physics, Quantum: The Search for Links,” in W. Zurek, ed., Complexity, Entropy, and the Physics of Information (Redwood City, CA: Addison-Wesley, 1990).

9. Schwartz, The Culture of the Copy. MIT Press.

10. Walter Benjamin, “The Work of Art in the Age of Mechanical Reproduction.”

11. Langdon Winner, “Mythinformation.”

12. S. Ulam, “Tribute to John von Neumann,” Bulletin of the American Mathematical Society 64 (1958): 1-49.

13. C. S. Peirce on icon, symbol, and index; multiple sources beginning 1886 and later.

14. Insert multiple Wittgenstein, Austin, and Gödel computability references.

15. John Durham Peters, “Resemblances Made Absolutely Exact: Borges and Royce on Maps and Media.”

16. Claude Shannon, “A Mathematical Theory of Communication” (1948), 1.

17. Shannon, 1948, p. 1.

18. Siva Vaidhyanathan, The Googlization of Everything, pp. 106-107.

19. Jonathan Mayer and Patrick Mutchler, “MetaPhone: The Sensitivity of Telephone Metadata,” accessed 2 May 2014 at http://webpolicy.org/2014/03/12/metaphone-the-sensitivity-of-telephone-metadata/.

20. An extended orphan footnote: Perhaps a way forward can be found in recognizing how digital media reveal both the false divide and the false convergence of these realms. Consider debates about phonography, or sound reproduction technologies, which straddle real sound (phono) and symbolic writing (graphy). Purist analog audiophiles and digital apologists have long disagreed about which is better: a perfect record player that, like the LP, reproduces continuous sound waves, or a digital approximation of those sound waves that, like the MP3, plays back discrete sound units smaller and faster than the human brain can detect. The analogists contend that digits will always distort, while the digitalists counter that if the human experience of reality is digital, then so should be our recordings of it (not to mention the pragmatic conveniences of digital audio mobility and storage). Digits that both count and point may help forge a third way forward, wherein, on the one hand, the analog purists must recognize in digital sound copying techniques, save for unique metadata, an exactness and fidelity to the digital copy that no LP can ever claim to the analog original; and on the other, we can also see in the MP3 of an original sound recording a larger truth about the “anything but” relationship of digital indexing: all digits at best approximate analogously, and never more, the real-world events they signify (layer in audio effects and filters post-production, and we encounter endless ways to make unreal music real again). Here the digital-analog divide dissolves. Not only is the digital analogic in indexing something beyond itself, but even that classic analog technology, the phonograph, depends on a kind of indexical digit, transformed: the phonograph needle is a kind of digital pointer applied so precisely to the real record grooves that the relationship it indexes between vibrations and patterns in electrical signal becomes iconic, not indexical: the needle's motions resemble the grooves they represent. Perhaps thus in all indexing we see how digital media resemble analog media (and analogies): both analog and digital media index the real approximately, and that indexing will always fall short of any iconic convergence between the real and the symbolic. See Jonathan Sterne, MP3: The Meaning of a Format; also Sterne, “The Death and Life of Digital Audio”; also Eric Rothenbuhler and John Durham Peters, “Defining Phonography: An Experiment in Theory.”

21. Indeed, the same principle is inherent in the concept of digits as quantities: digits at their core are not, and cannot be, concerned with one-to-one representations of reality, no matter how strong our will to program them to do exactly that may be. Similarly, one may have five apples but never the quantity “five” alone: the digit or quantity “five” cannot map exactly onto anything outside of the mathematical world of other digits or quantities without an approximate real-world unit. A unit (e.g., an apple) may be subject to digital-symbolic packaging, but it can never itself be digital. Digits, once well compiled, reflect loose analogies with the world outside.
