“Our tissues change as we live: the food we eat and the air we breathe become flesh of our flesh and bone of our bone, and the momentary elements of our flesh and bone pass out of our body every day with our excreta. We are but whirlpools in a river of ever-flowing water. We are not stuff that abides, but patterns that perpetuate themselves.” -Norbert Wiener, 1950 [1]

“If the work of the city is the remaking or translating of man into a more suitable form than his nomadic ancestors achieved, then might not our current translation of our entire lives into the spiritual form of information seem to make of the entire globe, and of the human family, a single consciousness?” -Marshall McLuhan, 1964 [2]

The term cybernetics was coined by the American mathematician Norbert Wiener in the 1940s to denote “the entire field of control and communication theory, whether in the machine or in the animal.” [3] The concept arose out of his work on problems of gunfire control and automatic missile guidance for the American war effort in World War II. Wiener saw the control systems used in these devices not as a series of interlocking mechanical processes, but rather as a continuous flow of information [link]. He derived the term from the Ancient Greek word kybernetes, meaning steersman or helmsman. [4] “Governor” derives from the same root, a governor being the helmsman of a state. The etymology implies, then, not just the study of how information flows, but specifically how it is used to control systems, whether mechanical, biological, or social. By coincidence and unknown to Wiener, the term had also been coined by Ampère in the early nineteenth century to denote political science, and by a Polish scientist in another context around the same time. Wiener's use of the term has endured, however, while the others have faded from the cultural lexicon. True to the nature of its subject, the term itself is rather amorphous, describing not a single, fixed discipline but a complex network of theories and ideas, including “the study of language, ... the study of messages as a means of controlling machinery and society, the development of computing machines and other such automata, certain reflections upon psychology and the nervous system, and a tentative new theory of scientific method.” [5]

Up to the present there have been three major periods in the study of cybernetics. From the field's inception in the mid-1940s through the 1960s, the focus remained on the concept of homeostasis—the idea that any system, mechanical, biological, or social, can maintain internal balance by adapting to the external conditions of its environment. In intense heat, a person sweats. The mechanism by which this happens is called feedback: a system changes the way it operates based on input from the environment on which it acts. Feedback mechanisms had been well understood since the mid-nineteenth century because of their central role in the self-regulation of steam engines. It was the marriage of feedback with a nascent information theory that produced homeostasis, the central construct by which cybernetics was understood throughout its first wave.
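The first wave's central mechanism can be glossed with a short sketch. The code below is purely illustrative and not drawn from the essay's sources; the function name, set point, and gain are all hypothetical. It shows a minimal negative-feedback loop in which a system repeatedly senses its deviation from a set point and acts against it, the way sweating returns body temperature toward normal.

```python
def regulate(state, set_point=37.0, gain=0.5, steps=20):
    """Minimal negative-feedback loop: sense the error, respond against it.

    Illustrative only -- the set point and gain are arbitrary numbers,
    not physiological constants.
    """
    history = [state]
    for _ in range(steps):
        error = state - set_point   # input from the environment: the deviation
        state -= gain * error       # the corrective response (the feedback)
        history.append(state)
    return history

trace = regulate(42.0)  # a system perturbed well above its set point
# successive values in trace move back toward the set point of 37.0
```

Each pass through the loop shrinks the error, which is the homeostatic behavior the first-wave cyberneticists saw equally in steam-engine governors and in sweating bodies.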

Starting around 1960, however, this theory was modified into the more powerful, pervasive, and subversive concept of reflexivity, defined as “the movement whereby that which has been used to generate a system is made, through a changed perspective, to become part of the system it generates.” [6] A reflexive system re-entangles with itself and becomes referential to itself. More than feedback, wherein a system reprocesses its own output, reflexivity implies a stronger sense of self-referentiality and self-awareness. For mathematicians this meant that statements of number theory could also become statements about number theory; a reflexive system can model itself. The danger and power of reflexivity lie in the fact that it blurs the traditionally accepted borders imposed on the world between subject and object, object and environment.
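This kind of self-reference can be glossed, very loosely, with a quine: a program whose output is its own source code, i.e., a system that contains a complete description of itself. This is an illustrative toy of my own, not an example from the cybernetics literature.

```python
# A quine: the string below is a template for the whole program,
# including itself. Formatting the string with its own repr() yields
# the program's complete source -- the system models itself.
source = 'source = %r\nprint(source %% source)'
print(source % source)
```

Running this prints exactly the two code lines of the program (minus the comments): a textual analogue of a statement of number theory that is also a statement about number theory.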

This blurring of borders was extended even further starting in the mid-1980s with the third wave of cybernetics, which has centered on the idea of virtuality. Though the word is commonly associated with advanced virtual-reality technologies, the term actually encompasses the much larger idea that material reality is “interpenetrated by informational patterns.” [7] When a person wearing a VR glove moves their hand and sees their virtual hand move accordingly, the implication is that a flow of information lies within the material substrate of their body, one that can be translated seamlessly into a computer simulation. At the heart of virtuality is this duality between materiality and disembodiment. The world exists simultaneously as a physical system and as an information system, ostensibly separate but inseparably intertwined. Cybernetics was constructed as the study of the latter in isolation, but the introduction of virtuality to the field's vocabulary has raised the question of how the two are interrelated. The field has so far given strong precedence to information over materiality, leading to a culture of disembodiment in which the virtual is just as, if not more, “real” than reality. Though certainly a romantic prospect, this has drawn strong criticism with regard to its implications for the construction of human identity, as addressed below.

In current popular use, much like virtuality, cybernetics is most often associated specifically with the development of artificial intelligence, virtual technology, and cyberspace. The idea that sets of data can be conceived of as a navigable space derives from the term's etymological origins. However, cybernetics in the context of technology is only a limited part of a greater whole, which deals with the study of information systems and the media in which they exist, both inorganic and organic. This versatility of subject is ultimately both the discipline's greatest strength and its greatest weakness. Cybernetics does not study the particular qualities of material instantiation, but looks only at the level of a system's functionality—“it treats not things, but ways of behaving.” [9] This approach to the study of systems parallels that of behavioral psychology, and similar criticisms may be raised. By reducing their focus of study in this way, both fields gain incredible power and versatility in the claims they can make, but it has also been argued that both become so broad and so lacking in material specificity as to become irrelevant. Cybernetics claims that the flow of information in a system can be studied independently of the medium in which that information exists. As reflexivity and then virtuality were introduced to the field, it became apparent that not only can information be studied independently of its material substrate, it can also flow between media. Extended further and further, the idea leads to a model of the world in which media serve as a series of irrelevant substrates through which pure information freely flows.
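The claim that a pattern of information is separable from its substrate admits a toy illustration (mine, not the essay's): the same pattern passed losslessly through several different representations, with the message surviving every change of medium.

```python
# Illustrative only: one informational pattern, three "substrates".
message = "kybernetes"               # medium 1: a text string
as_bytes = message.encode("utf-8")   # medium 2: raw bytes
as_numbers = list(as_bytes)          # medium 3: a list of integers

# Translating back through each medium recovers the pattern intact.
recovered = bytes(as_numbers).decode("utf-8")
assert recovered == message
```

A critic of the field might reply that this neatness is precisely the problem: the translation is lossless only because each medium was engineered for lossless encoding, a property the material world does not generally share.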

McLuhan began to deal with the implications of this translatability in Understanding Media, when he stated that after extending or translating “our central nervous system into the electromagnetic technology, it is but a further stage to transfer our consciousness to the computer world as well.” [10] Although he does not use the word, McLuhan is alluding to the cybernetic possibility of human beings interfacing and entangling with machines on a neurological and functional level—which is to say, becoming cyborgs. Just as binoculars are an extension of the eye and clothes an extension of the skin, so information technologies become McLuhan's extension of the mind. The Internet clearly serves as the next step in this process of extension, even though his work far predates its inception. By connecting all computers into a pervasive, global network of information, man is not only able to extend his nervous system to interface with technology, but can use that mediation to connect directly with the nervous systems of other human beings tapped into the same network. McLuhan's enigmatic prediction of a single consciousness dealt precisely with this possibility.

This raises very strong critical issues with regard to the traditional view of liberal humanism. John Locke and Jean-Jacques Rousseau, among other Enlightenment thinkers, posited the mind as the seat of human identity. We think, therefore we are. The birth of information theory allowed people to theorize the human being as a self-expanding system of information, and cybernetics allowed this information to flow freely between media, no longer tying it to its material substrate. The human body becomes, as the protagonist of William Gibson's Neuromancer put it, “data made flesh.” [11] In an age of increasing focus on information technologies and the ways in which people interface with them, the boundary distinguishing an individual from their surroundings becomes blurred, if not shattered entirely. In many ways this is a further extension of the probabilistic world-view introduced by quantum mechanics. [12] As Werner Heisenberg pointed out, one cannot observe a system without affecting it. Cybernetics extends the proposition further: its theorists have argued that one also cannot avoid being affected by that system. Through reflexivity and virtuality, cybernetics has deconstructed the illusion that systems of information can be separated from their surroundings in any meaningful way. As the posthumanists have theorized, we exist simply as information systems that happen to inhabit the material instantiation of our bodies; the fact that we are tied to these bodies, and even the fact of consciousness itself, is not a necessary quality of human existence but a result of specific historical circumstances—an accident of history. [13]

The converse of the problem of humans becoming more like machines is the question of machines becoming more like humans. If human identity has been reduced to an information system that happens to inhabit the body as its medium, what is to say that another information system inhabiting, say, a computer, or the Internet, could not be perceived as equally “human”? This question has been a driving force for the study of artificial intelligence, and it has manifested repeatedly in other media: in Orson Scott Card's Speaker for the Dead, a more advanced descendant of the Internet develops consciousness; in Richard Powers' Galatea 2.2, an artificial intelligence being trained to pass an English Master's exam begins to question the nature of its existence; and in the popular Matrix films, machines advance to the point of overthrowing their human creators to become the oppressors.

William Fulton
Winter 2007