"A notation system or, as we have chosen to translate, a discourse network has the exterior character--the outsidedness--of a technology. In Kittler's view, such technologies are not mere instruments with which "man" produces his meanings; they cannot be grounded in a philosophical anthropology. Rather, they set the framework within which something like "meaning," indeed, something like "man," become possible at all."

--David E. Wellbery[1]

It seems clear that we have passed some critical point in our use of technologies for information storage and retrieval. Our technologies have changed, and with them the networks that mediate that which constitutes "meaning," "man," and "knowledge." It warrants noting, however, that the verb to inform and the noun information are rarely used as cognates in the late twentieth century. Information means data flow increasing in bandwidth every day, data flow that increasingly permeates every aspect of daily life as it converges upon the eventual connection of every microprocessor in every electronic device in use. Whether you are filled with excited anticipation or sinking horror by the clarion "You Will" of AT&T's television commercials, you probably suspect that there is much more at stake than the slick ads display. It is not uncommon to hear that our relationship to information is changing with our technologies, and even less uncommon to hear that it ought to change. Whenever one speaks of the necessity for change, one is talking about an arena of struggle. The following discussion examines this struggle and some of the negative consequences of attempting to persist in our current relationship to information, and points a way toward a potential re-enchantment with our information technologies. The struggle is a struggle for the design and control of an organizing principle that is both a root metaphor and an everyday engineering problem: "code."


What is meant here by "code" will have to wait for a full development. Suffice it to say, what is not meant by "code" is a simple system of encryption by which one set of symbols stands in for another in order to cache a secret message. Nor is the focus upon moral, civil, or dress codes, all highly formalized and rigid systems of control.[2] The concern here, and what is being fought for, concerns the coding of codes, the code that stands for nothing in and of itself, whose relation to objects and symbols is one of control and manipulation, not of representation. Or, as Baudrillard might have said, 'the code that is more code than code...hypercode.' Sadly, though, this is about the only hyper-transformation that he does not concede. He marks his terrain by asserting that "[w]e are in a logic of simulation which has nothing to do with a logic of facts and an order of reasons...[t]he real does not concede anything to the benefit of the imaginary: it concedes only to the benefit of the more real than real (the hyperreal) and to the more true than true. This is simulation."[3] However, Baudrillard never, even when discussing code, speaks of hypercode, but reduces the idea of code to its most simplistic form: the binary opposition.

But code is more complicated than the bit to which Baudrillard reduces it, and hypercode even more so. Hypercode involves the coded manipulation and control of code itself; two inexact models might serve to clarify: compression and metaphor. Compression has become crucial in order to help handle and manipulate an increasingly elephantine info-structure. Popular archival programs encode other programs in order to compress their size for transmission and storage. This kind of encryption is never a question of caching secrets, but rather an instance of the manipulation of code with code. This process is, of course, reversible and therefore demonstrates qualities of being a transcription that hypercode does not share. Our second model, metaphor, by contrast, is a trope concerned not with transcription or even with referentiality, but with the manipulation of language. To "literalize" a metaphor is to transcribe the code to another code, and in so doing, destroy the metaphor. In this sense, literal and figurative language cannot be seen as just two ways of saying the same thing, but rather as two codes, themselves both abstractions. Metaphor has been described as a "way of seeing," and it is here that it operates on the level of hypercode. Like metaphor, hypercode cannot be transcribed; it is a "way of seeing," even a "way of knowing," and a way to manipulate that knowledge, much as compression utilities manipulate text files.
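The reversibility that marks compression off from hypercode can be sketched in a few lines. The following uses Python's standard zlib module; the sample bytes are only a stand-in for the programs an archiver would encode:

```python
import zlib

# A stand-in for the file an archival program would encode
source = b"to manipulate code with code is not to cache a secret " * 50

compressed = zlib.compress(source)      # code operated upon by code
restored = zlib.decompress(compressed)  # and the operation is reversible

assert restored == source               # a transcription: nothing is lost
assert len(compressed) < len(source)    # though the representation shrinks
```

It is exactly this guarantee, that decompression restores the original, that metaphor, and hypercode, do not offer.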

The struggle for control of hypercode is the battle for the shaping of the root metaphor that will inform and frame the way we act and react in our environments. In fact, hypercode will in a very real sense redefine both the "human" and the "environment" through such wide-ranging technologies as hypermedia, virtual reality, interactive television, "smart" interface agents, and other technologies unforeseeable at the present time. These technologies "enframe," to borrow a Heideggerian term, not just by giving "humans" new extensions to the "environment," but by transforming the very way in which such concepts are thought. Even on a very mundane level, most members of the general public think of their brains as storage and retrieval devices. How many people have you heard talk about "access" to memory? How many times have you heard words like "processing" used when referring to "thinking?" When was the last time you heard the phrase, "the brain is a muscle?" But in a very real way, the brain is no longer a muscle--some vestigial appendage of the Industrial Revolution's metaphoric reality--but a computer--part of a fundamentally new way of living our relations to our bodies. The brain was a muscle when the dominant technology was mechanical: levers, engines, "workhorses" of the industries. The epistemological shift is a radical one. The industrial comparison ran from engines to muscles and muscles to brains; now it is a direct comparison from "tool" to "body" and vice versa: the brain is a computer *and* the computer is also a brain. In fact, the tool/body distinction is in high flux as the computer increasingly merits the anthropomorphizing that is often conferred upon it. Where does the body stop and the tool begin?


What can we know about a system in which we are presently living? What are the conditions for the possibility of altering or manipulating the frame from within?[4] Discussing his approach to textual information in the afterword to _Discourse Networks_, Friedrich Kittler differentiates between traditional literary criticism and "discourse analysis":

"Traditional literary criticism, probably because it originated in a particular practice of writing, has investigated everything about books except their data processing. [...] Discourse analyses, by contrast, have to integrate into their materialism the standards of the second industrial revolution. An elementary datum is the fact that literature (whatever else it might mean to readers) processes, stores, and transmits data, and that such operations in the age-old medium of the alphabet have the same technical positivity as they do in computers. [...] What remain to be distinguished, therefore, are not emotional dispositions but systems. [...] Whereas interpretation works with constants, the comparison between systems introduces variables. If the latter pursues historical investigations, then 'at least two limiting events' are indispensable, for which either systemic differentiation or communicational technique can be considered criteria."[5]

Kittler marks "universal alphabetization circa 1800 and technological data storage circa 1900" as the "limiting events" of his preceding study of the discourse networks of 1800/1900. For Kittler, delimiting the endpoints is critical. The title of the book suggests that it is not a question of comparison, but one of the ratio of 1800 over 1900; you can only compare and contrast discrete elements, but a discourse network is a continuous function. The phrase "continuous function" is not meant as a metaphor but as a literal description of the way in which Kittler pursues his course of study. His techniques are mathematical and marked as such. Kittler says that "discourse analyses have to integrate," that you need "limiting events." His is a calculus of analysis. Kittler's technique is to establish where "systemic differentiation can be considered criteria"--in other words, where the continuous function displays a change in behavior, where its derivative (i.e., differentiation) is zero or undefined.
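The analogy can be taken almost literally. A minimal numerical sketch--the function below is invented for illustration and is in no way Kittler's--locates a "limiting event" by scanning for the point where the derivative of a network function passes through zero:

```python
# An invented "discourse function" with one critical point
def f(x):
    return (x - 1850) ** 2 / 1000.0

# Central-difference approximation of the derivative
def derivative(f, x, h=1e-5):
    return (f(x + h) - f(x - h)) / (2 * h)

# Scan the interval 1800..1900 for a sign change in f'
critical = None
prev = derivative(f, 1800)
for year in range(1801, 1901):
    d = derivative(f, year)
    if prev * d <= 0:   # f' passed through zero: the behavior changes here
        critical = year
        break
    prev = d

# critical now holds the point of "systemic differentiation": 1850
```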

Kittler sets up his ratio of 1800/1900, finds the limits of integration of the numerator and denominator, and performs a summation of differential elements consisting of traditional and marginal scientific studies, literature, psychology, pathology, physiology, politics, historical accounts, and biography in order to give an idea of the behavior of the function that describes the discourse networks in which he is interested. Boundaries and differential elements are all one needs to perform an integration over a function, and Kittler's text performs exactly this in order to describe the behavior of these networks.

The notions of function and system are intrinsically linked. Function, in fact, is often used synonymously with system. Discourse network is therefore a somewhat unhappy translation of *Aufschreibesysteme*, "systems of writing," in which the term "system" drops out from clear view. On the other hand, Kittler is fundamentally concerned with "discourse." To emphasize the relationship between hypercode and our present "discourse network," the "compressed" term DN2K will be used to refer to the latter. Perhaps the "D" cannot even be thought of as "discourse," but "data." Kittler's method does not rely on discourse as the object; it is simply that this is the dominant form of information storage and retrieval in the discourse network of 1900. It is the notion of system and its relationship to the mathematical function that supplies the conceptual framework for Kittler's analyses.

System analysis consists of four crucial steps: 1) establish the boundaries, 2) identify the discrete elements and their attributes, 3) map the relationships between the elements, and 4) map the behavior of this system as it interacts with its environment. In this general description, system analysis closely resembles how one deals with a mathematical function: 1) establish the range and domain, 2) identify the variables in question, 3) map, or plot, the behavior of this function as one variable relates to another, and 4) observe the function's behavior as it changes over time. Kittler exploits this relation between system and function in his network analysis, literally dividing the behavior of the network between his boundaries into differential elements, and performing a narrative summation.
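Kittler's "narrative summation" has the same shape as a Riemann sum, which needs nothing but boundaries and differential elements. A sketch, with an arbitrary stand-in function:

```python
# Boundaries and differential elements are all one needs to integrate.
def network(x):
    return x ** 2   # arbitrary stand-in for the behavior of a network

def integrate(f, lower, upper, n=100_000):
    dx = (upper - lower) / n                       # one differential element
    return sum(f(lower + (i + 0.5) * dx) * dx      # midpoint of each element
               for i in range(n))                  # the summation itself

result = integrate(network, 0, 3)   # definite integral of x^2 over [0, 3]
assert abs(result - 9.0) < 1e-6     # analytic answer: 3^3 / 3 = 9
```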

To draw attention to Kittler's mathematics of analysis is not to repudiate this method of study but to demonstrate the reason for which an analysis of DN2K cannot yet proceed by Kittler's method. Sufficient data concerning our emerging network is not yet available for the analysis to proceed as a definite integral. All that is currently known--and, in varying terms, hotly debated in every field of study--is that we have passed some critical point, that the behavior of the network function that creates the spaces in which our daily experience and understanding of the world take place has "somehow changed." What we can study, therefore, is the bottom limit of our network curve, that which decidedly separates DN2K from the discourse network of 1900: hypercode.


The critical point between 1800 and 1900--the transposition of media--provokes the erosion of printed language's privileged status. That text is one medium among others is no longer an esoteric belief, but an informing principle. According to Kittler, two simultaneous processes catalyze this erosion: 1) the flight of ideas, and 2) the invention of other communicational and storage media. Both of these processes emphasize the importance of nonsense and its transcription. To log nonsense, and even to induce the breakdown of meaning, is to underline the materiality of the medium. The result of these endeavors was to make this breakdown more interesting than the previously "normal" functioning of language, to examine these breakdowns as instances of code, rather than as anomalies to be expunged. It is in this sense that the discourse network of 1900 is concerned with de-coding and transposition, toward the eventual goal of re-inscribing this logic upon the subject. Discussing Freud's image of the rebus and its transposition as symbolic language to the signifier, Kittler says that:

"Psychoanalysis made into something significant--indeed, into the signifier itself--the nonsensical attribution of nonsense to the fact that someone confused precisely the letters m and n. [...] An inscription as meaningless as it is unforgettable can thus be decoded. The triumph of the Freudian transposition of media is to have made it possible to solve singular problems of differentiation with an individual experimental subject. [...] Freud's discourse was a response not to individual miseries but to a discourse network that exhaustively records nonsense, its purpose being to inscribe people with the network's logic of the signifier."[6]

Freud's technique is to understand the dream language as a rebus, a picture/letter puzzle; to break it into its syntactic parts; to separate its continuous messaging into the control codes that generate it; and to transpose these elements into signifiers. This transposition is from one code to another, a decoding of one medium to understand it in the medium of the discourse network in play.

DN2K still has fundamentally to do with code, but not with its decoding; decoding would not serve any conceivable purpose in this network. One certainly manipulates code in order to understand how it works, how it can be modified, but at no point is transcription an issue. There is no DN2K counterpart to the 1900 drive to transcribe coded dream messages, or any similar rebus, into linguistic signifiers. What does one get when one "decodes" an executable computer file? Certainly not the program that was compiled to create the application. Machine language and programmer's code are not like cryptology; they are not parts of some secret language. Furthermore, there is no drive to inscribe people with the network's logic; in DN2K the codic logic is understood to be already functioning in people. Bodies are already envisioned and manipulated in exactly the same hypercodic ways as everything else.
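The point can be made literal with Python's own compiler and disassembler: what comes back from "decoding" a compiled object is opcodes, not the program that was written. The tiny source string below is, of course, only an illustration:

```python
import dis
import io

source = (
    "# this comment vanishes at compilation\n"
    "def area(radius):\n"
    "    return 3.14159 * radius * radius\n"
)

code = compile(source, "<example>", "exec")   # source -> bytecode

listing = io.StringIO()
dis.dis(code, file=listing)                   # "decode" the compiled object
disassembly = listing.getvalue()

assert "comment" not in disassembly   # the writer's commentary is unrecoverable
assert "RETURN_VALUE" in disassembly  # what remains is machine-level operation
```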

Physiology, the study of bodies, without doubt plays a major role in the discourse network of 1900, but the emphasis is different in one fundamental way. Take for example Rilke's notion of playing the cranial suture, which for Kittler "designates the status of all script for a writer of 1900."

"The coronal suture of the skull [...] has--let us assume--a certain similarity to the closely woven line which the needle of the phonograph engraves on the receiving, rotating cylinder of the apparatus. What if one changed the needle and directed it on its return journey along a tracing which was not derived from the graphic translation of sound, but existed of itself naturally--well: to put it plainly, along the coronal suture, for example. What would happen? A sound would necessarily result, a series of sounds, music..."[7]

Kittler demonstrates that this image illustrates the common concern of Simmel, Freud, and Rilke in their endeavors to "track traces without a subject." This "writing without a writer," which "records the impossible reality at the basis of all media: white noise, primal sound," is fundamental in 1900, since to play the cranial suture is to deny the skull its "naturalness." To play the skull is to blur the line between the natural and the technological, but it is still, as Kittler says, a "discursive manipulation." This kind of manipulation, although far from any kind of translation, is still a transposition from the skull as code to the phonographic as code. This is not yet hypercode.


What we have been calling "hypercode," the code of codes, is the critical point between the discourse network of 1900 and DN2K. DN2K takes as its basis the noise that 1900 exhaustively logged, and takes as proven the post-structuralist demonstration that meaning is a very small, and usually fragile, subset of nonsense. Deconstruction has demonstrated ad infinitum that when pushed to produce the ground upon which appeals to truth must rest, texts enter an endless chain of différance. The concern is no longer with demonstrating the vulnerability of meaning to chaos and nonsense or the non-exclusivity of any medium that claims intimacy with objects, but with the coded control of these media themselves. One of the most crucial aspects of this, however, is the fact that the human body is also understood as coded. There is no essential ontological difference between subject and object, human and animal, animal and machine. As Donna Haraway declares:

"By the late twentieth century, our time, a mythic time, we are all chimeras, theorized and fabricated hybrids of machine and organism; in short, we are cyborgs. The cyborg is our ontology; it gives us our politics. The cyborg is a condensed image of both imagination and material reality, the two joined centres structuring any possibility of historical transformation."[8]

The move from 1900 to DN2K is a decisive change in the behavior of our discourse function, and not merely an intensification of degree. The separation of imagination and reality is assumed by the discourse network of 1900; the move is to control the resulting framework. Haraway argues that the "communications sciences and biology are constructions of natural-technical objects of knowledge in which the difference between machine and organism is thoroughly blurred" and that the consequences of this are "fundamental transformations in the structure of the world."

If indeed the "cybernetization" of every aspect of daily life has made cyborgs of what was formerly "human," if the cyborg is a new "self" which Haraway describes as "a kind of disassembled and reassembled postmodern collective and personal self," then when she asserts that "this is the self feminists must code," we must go further. This is the battle for hypercode, and in a very real sense it is a battle for daily existence, even for sanity. Anyone who wishes to survive must learn to code, anyone who wishes to avoid being part of a system of cybernetic control must learn to be a cyborg programmer.

It is critical that statements concerning the inevitability of a fundamental shift in our meaning generating systems not be taken as statements of unbridled enthusiasm. Techno-lust and Nature-nostalgia both will have their playgrounds in DN2K, but neither will have control. Haraway is concerned with the abuses of this new network, holding C^3I to be the epitome of what she calls the "informatics of domination." Baudrillard is nothing short of obsessed with new forms of domination, at times almost appearing nostalgic for a form of domination that he can sink his Marxist teeth into. Certainly, domination of a new sort is a very real threat. Since we will be shaped by our new technologies in every aspect of daily life, how we resist and react to DN2K will have an impact on how we live and on what shape the function takes from here.

As cyberspace and virtual reality move from the speculative pages of our fiction to the functional spaces of our desktops, those who wish to move from "the business of reading books"[9] to the business of mapping the new epistemological space will become cyborg programmers. Those who want to move away from the business of atomizing "knowledge" and storing it in the piggybanks of children's minds[10] will become cyborg programmers. If indeed the "boundary is permeable between tool and myth, instrument and concept, historical systems of social relations and historical anatomies of possible bodies, including objects of knowledge," hypercode is the principle of permeability and the route to a re-enchantment with the technologies that support and constitute our epistemological relation to the world.

In attempting to find some kind of route to a productive or "enchanted" relation with these new technologies, it is useful to examine two non-functional responses to our changing epistemological terrain. The first I have termed "the nostalgic response." This is the response of those who are concerned not so much with the "green" concerns involving our natural world as with the perception that the distance between this "natural" space and our epistemological space is increasing. This type of response is typically accompanied by the notion that the stuff of television is just so much "fluff." The second is the information-overflow response, well typified by Baudrillard. This anxious response fears information's unbridled overflow of its traditional boundaries as it becomes increasingly impossible to display a kind of categorical, let alone a pan-categorical, mastery of the world's knowledge.

Bill McKibben, in _The Age of Missing Information_, describes an experiment in which, during one twenty-four-hour period, he videotaped everything that was broadcast on television. He compared this to a day spent in the woods:

"One day, May 3, 1990, lasted well more than a thousand hours--I collected on videotape nearly every minute of television that came across the enormous Fairfax cable system from one morning to the next, and then I watched it all. The other day, later that summer, lasted the conventional twenty-four hours. A mile from my house, camped on a mountain top by a small pond[...]"[11]

McKibben's claim throughout the book is simple: we think we live in an age of information, in which television is the privileged informer, yet we are more ignorant than ever and losing vital "information" concerning our "natural" environment. It is the loss of this "natural" space that has contributed to the ignorance of "vital knowledge that humans have always possessed about who we are and where we live." He calls this the "Unenlightenment."[12] His book is both a call to re-establish some communion with our eroding natural environment and a mourning for the loss of this possibility.

McKibben's experiment is useful in two ways. It shows that once a critical point in the network function has been passed, there is no going back. Critical points mark the transition from one system to another. This transition is what Ilya Prigogine calls an "irreversible process."[13] The network which defines "man" against "nature" is no longer a viable meaning generating system. It also demonstrates that television is not the best model for our new network.

This second observation seems banal, but is critical. Perhaps due to McLuhan's claims for television in _Understanding Media_, many critics of popular culture and media studies have followed suit, holding television to be the model for a new information age. Almost all of these critics, of course, decry the loss of "valuable" information, usually more subtly than does McKibben, but almost without fail in comparable ways.


In the opening passages of "The Ecstasy of Communication," Jean Baudrillard quite correctly notes that there has been a change in the function of communication; a new way in which new technologies mediate lived relations, a new constellation of events in which "[t]he oppositions subject/object and public/private" do not have any significant meaning. He ends his article in a tone of almost despondent resolution:

"In any case, we will have to suffer this new state of things, this forced extroversion of all interiority, this forced injection of all exteriority that the categorical imperative of communication literally signifies.[...] No more hysteria, no more projective paranoia, properly speaking, but this state of terror proper to the schizophrenic: too great a proximity of everything, the unclean promiscuity of everything which touches, invests and penetrates without resistance, with no halo of private protection, not even his own body, to protect him anymore."[14]

To a great extent, this passage can be read as the paranoiac of "the pathology of organization" himself suffering the "terror" of the increasing schizophrenia of instant communication. In the hypercodic, there is no convincing way to think about information in absolutist terms of "truth." To attempt to decode the blood-flow of information, arriving at ever higher bandwidths, in these terms--to attempt to function in DN2K with the techniques of mastery applied to textual space--will result in the panic that precedes the aneurysm. Self-help books appear like road signs along the Information Highway claiming to help travelers suffering from "infoglut," "information overload," and "information anxiety."

Baudrillard himself professes to this kind of anxiety:

"In terms a little different for each medium, this is the result: a space, that of the FM band, is found to be saturated, the stations overlap and mix together (to the point that sometimes it no longer communicates at all). Something that was free by virtue of space is no longer. Speech is free perhaps, but I am less free than before: I no longer succeed in knowing what I want, the space is so saturated, the pressure so great from all who want to make themselves heard."[15]

Baudrillard no longer finds the knowledge for which he is looking. He is paddling against the flow of information, occasionally dipping a bucket into the stream, agonizing over the fact that it does not contain the molecules of data that he wanted. Part of the problem is that like McKibben, he takes television to be the model technical medium of this network, considering it to be "the ultimate and perfect object for this new era." It is precisely those still living in the subject/object, human/animal, animal/machine network who will view television as the ultimate in their communicational lives; they will suffer the schizophrenia of five hundred channels, unable to find anything to hold their attention for more than the time it takes to find the remote control. The cyborg will get only one channel, with exactly what it wants showing, and only when it wants to indulge in the one-to-many communications that typify the discourse network of 1900.


With the cyborg, it is not a question of Man using Tools, nor even a question of Man adapting to the Machines; it is not about adaptation at all. Adaptation is an evolutionary change of an organism in response to its environment. It is no longer a question of organism in environment; it is a question of interacting subsystems. Codes control the body, codes control the surroundings, and codes control the interaction. If you want change, adaptation is not the appropriate response; modification is. If you want change, you modify your code, that of the other subsystem that you may still think of as environment, or the way your subsystem interacts with it. Better still, you tweak all three until the system functions as desired.
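The difference between adaptation and modification can be put schematically. In the toy sketch below (all names are invented for illustration), nothing evolves; the three codes--body, environment, interaction--are simply edited until the system behaves as desired:

```python
# Three codes, each open to direct modification (names are illustrative)
body = {"gain": 1.0}
environment = {"signal": 10.0}

def interaction(body, environment):
    return body["gain"] * environment["signal"]

baseline = interaction(body, environment)   # 10.0

# Modification, not adaptation: edit the codes themselves
body["gain"] = 2.0             # tweak the body's code
environment["signal"] = 3.0    # tweak the "environment's" code
tuned = interaction(body, environment)      # 6.0

assert baseline == 10.0 and tuned == 6.0
```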

This view of programming has to be one in which chaos, noise, and interference are seen not just as the backdrop against which order, sense, and control play, as in the discourse network of 1900, but as desirable and inevitable. Baudrillard understands the importance of programming, of coding, in the realm of simulation, but shows no awareness of a programming that would allow for the uncontrolled:

"Surely this must mean the end of the theater of representation--the space of signs, their conflict and their silence. All this is replaced by the black box of the code, the molecular signal emitter with which we are irradiated. Our bodies are crisscrossed by question/answer formulas and tests, like programs inscribed in our cells. Bodily cells, electronic cells, party cells, microbiological cells: we are always on the lookout for the tiniest, indivisible element, whose organic synthesis arises form the givens of the code. But the code itself is only a genetic, generative cell where myriad intersections produce all the questions and all the possible solutions.[...] This is the space of an unprecedented linearity and one-dimensionality: a cellular space for the indefinite generation of the same signals, like the ticks of a prisoner driven mad by loneliness and repetition."[16]

For Baudrillard, the coded/programmatic is nothing but a hermetic system of self-generation, creating endless permutations of the same. He argues that though this brings about the end of determinism, code ends up being a deterministic system anyway, with questions and answers all preprogrammed. Even worse, now that everything from natural systems to biological systems is seen as code, the programmatic space has brought an end to diversity. He sees the codic as the generation of ontologically undiversified and undifferentiable signals, endlessly interchangeable and never original. Baudrillard himself uses the phrase "the black box of the code," and indeed, this phrase is used correctly. For Baudrillard, code is one monolithic black box offering input at one end, output at the other, with no understanding of or even access to the manipulations inside the box. The cyborg's code, on the other hand, is white box, and the cyborg is armed with full understanding of how to tweak.
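The contrast between the two codes can be sketched directly. A black-box function exposes only input and output; a white-box version of the same transform exposes its internals for tweaking (the arithmetic here is arbitrary, chosen only for illustration):

```python
# Black box: input -> output, internals sealed from view
def black_box(x):
    return (x * 3 + 7) % 11

# White box: the same transform with its parameters exposed
def white_box(x, multiplier=3, offset=7, modulus=11):
    return (x * multiplier + offset) % modulus

assert black_box(4) == white_box(4)                 # identical by default
assert white_box(4, multiplier=5) != black_box(4)   # but open to tweaking
```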

Only the cyborg programmer will retain its sanity in DN2K because only the cyborg will know how to resist the temptation of the will to mastery, the need to calculate the incalculable, to consider order and predictability the norm. The cyborg will not only accept the irreversibility of systemic fluctuation, but will learn to induce it in resistance to the "informatics of domination." Inducing systemic fluctuation is not the same as controlling it, nor can it be an attempt at mastery of referents or first level codes--hypercode is the cyborg's language. Baudrillard cannot help but conclude that passivity is the only resistive response given his misapprehension of code. The McKibbens of the world will suffer info-aneurysms as they attempt to cull their human information from the data-chaff. The cyborg knows, however, that impact, not truth, must be the operating principle.

We have passed the bottom limit and entered a new network. Information, even in simulation, has real impact on real bodies. The question that remains is the most difficult of them all: If it is no longer a question of being, but of becoming, how do we engineer this new cyborg-self?

For now it can be as simple as fighting against your will to information mastery. Humans and machines are already working together in ways that blur the distinction between them, but they bring different strengths and weaknesses to their coupling. Humans are not good at exactitude, at quick calculation, or at brute-force searching. Machines are not good at association, at intuitive or intellectual leaps, or at abstraction. Striving daily to let each component of our new epistemology--the machine and the human--use its own strengths will take us a great distance from information anxiety and toward a re-enchantment with our knowledge.

Increasingly, however, this coupling will develop along the lines of CMC (Computer Mediated Communication) tools, further blurring the distinctions between humans and machines. Whatever else USENET is, however inane many of the posts have become or always were, it represents a truly remarkable thing: one person can use a machine to compose and post a query to a potential audience of 10 million readers. These readers can in turn use their machines to search for and post their answers for the original poster to read. This represents a machine-aided *human computation*, using the brains of 10 million people to search for a response to a question. What is it, though, about a certain post that prods one to take the time to respond, even actively search for an answer, in an attempt to help someone whom one does not and will not ever know personally? What, really, is the difference between hacking the human and hacking the box? It is all in the hypercode.


1. Foreword to Friedrich Kittler, _Discourse Networks 1800/1900_ (Stanford, CA: Stanford University Press, 1990), p. xii. [Originally published in German in 1985 as _Aufschreibesysteme 1800/1900_.]

2. See, for an example of this style of probe, Roland Barthes, _Mythologies_ (Paris: Editions du Seuil, 1957).

3. Jean Baudrillard, "Fatal Strategies," _Selected Writings_ (Stanford, CA: Stanford University Press, 1988), 188. See Martin Heidegger, _The Question Concerning Technology and Other Essays_, trans. William Lovitt (New York: Harper and Row, 1977), 3-35.

4. Michel Foucault disparages this possibility: "[...] it is not possible for us to describe our own archive, since it is from within its rules that we speak, since it is the archive that gives to what we can say--and to itself, the object of our discourse--its modes of appearance, its forms of existence and coexistence, its system of accumulation, historicity, and disappearance" (_L'Archéologie du Savoir_ [Paris: Editions Gallimard, 1969] 171). I find his reasoning compelling in reference to his own methodology, but not, perhaps, to Kittler's.

5. Kittler, _Discourse Networks 1800/1900_, 370.

6. Ibid., 282.

7. Ibid., 316 (citation of Rilke, "Primal Sound," 53).

8. Donna Haraway, "A Cyborg Manifesto: Science, Technology, and Socialist-Feminism in the Late Twentieth Century," _Simians, Cyborgs, and Women: The Reinvention of Nature_ (New York: Routledge, 1991), 150.

9. "Many areas of endeavor in America pressured by technological change have already had to decide what business they were really in, and those making the narrow choice have not usually fared well. A fascinating instance of this choice is now taking place in the piano industry. Steinway used to own the market, and it has decided to stay in the piano business. Yamaha decided it was in the keyboard business--acoustic and electronic--and has, with Roland, Korg, and other manufacturers, redefined the instrument. Time has yet to tell who will win, financially or musically. For all its fastidious self-distancing from the world of affairs, literary study faces the same kind of decision. If we are not in the codex book business, what business are we really in?" (Richard Lanham, "The Electronic Word: Literary Study and the Digital Revolution," _New Literary History_ (Winter 1989) 20:2:270.

10. See Papert, _The Children's Machine_.

11. Bill McKibben, _The Age of Missing Information_ (New York: Random House, 1993), 9.

12. Ibid.

13. Ilya Prigogine and Isabelle Stengers, _Order Out of Chaos: Man's New Dialogue with Nature_ (New York: Bantam Books, 1984). [Originally published in French under the title _La Nouvelle Alliance_.] See in particular pages 268-294.

14. Baudrillard, _Selected Writings_, 132.

15. Ibid., 131-2.

16. Ibid., 140.