The age of computers has brought new methods of producing, storing, and receiving media; digital media is one part of the larger digital revolution of the information age. The very idea of a "digital" medium necessitates an opposite notion of "analog" media, a class that did not exist as such until it was named. Theorists have tried to separate (or unify, depending on one's perspective) the two (cf. Lunenfeld, 2000; Manovich, 2002), yet McLuhan had it right first when he noted that "the 'content' of any medium is always another medium" (1994:8). The analog and the digital are co-dependent; neither could exist without the other. To see how the digital and analog are inseparable, however, we must first see how they are distinct.
The use of the word "analogue" was preceded by an earlier synonym, "analogon", which the Oxford English Dictionary traces to a combination of Greek root words ("ἀνά" and "λόγος") meaning "that which is according to due ratio, proportionate, conformable". The first definition given for "analogue" reads: "An analogous word or thing; a representative in different circumstances or situation; something performing a corresponding part". This meaning is rooted in the idea of an "analogy" (for which the OED provides "analogue" as a synonym), which is "a name for the fact, that, the relation borne to any object by some attribute or circumstance, corresponds to the relation existing between another object and some attribute or circumstance pertaining to it". Although originally a term in mathematics (as a "proportion; agreement of ratios"), "analogy" starts appearing with the transferred sense of a general relation, often expressed proportionally, in Plato (OED).
The word "digital" can be traced directly to a Latin term ("digitalis") that means "of or belonging to a finger", from which comes the first definition given by the OED: "of or pertaining to a finger, or to the fingers or digits". When "digital" is specifically "applied to a computer which operates on data in the form of digits or similar discrete elements", we must note that the term "digit" also has an arithmetical meaning derived from the fact that humans have ten fingers: "Each of the numerals below ten (originally counted on the fingers), expressed in Arabic notation by one figure".
Thus by the 1940s, the terms "digital computer" and "analog computer" ("a computer which operates with numbers represented by some physically measurable quality" (OED)) entered the language as antonyms. Analog computing had been in practice for a number of years prior to the development of the first digital machine. The difference between the two is that "instead of computing with numbers, one builds a physical model, or analog, of the system to be investigated" (Campbell-Kelly and Aspray, 1996:60; also cf. symbolic/real/imaginary; simulation/simulacrum (2)).
Since analog computers operated by modeling a specific system, they were typically single-purpose machines not suited to accurately solving a wider array of problems. Important analog computers included Lord Kelvin's tide predictor in 1876, which charted tide tables for a specific harbor; electrical power network analyzers (such as MIT's AC Network Calculator, built in 1930), which modeled and helped design the complicated power network systems built in the United States in the 1920s; and Vannevar Bush's differential analyzer in 1931, which was able to solve a class of mathematical problems known as ordinary differential equations (ibid:61-3). The first publicly unveiled general-purpose automatic machine that worked with actual numbers ("digits") was the Harvard Mark I, built by IBM for Howard Aiken at Harvard and completed in 1944 (ibid:69-75). The Mark I was soon surpassed by the first fully electronic digital computer, ENIAC, the following year.
Yet the term "analog computer" itself is something of an anachronism in reference to the earlier machines. The word "computer" was still used as late as World War II to refer to people (typically women) whose job it was to compute mathematical sums (ibid:66-9). Thus, after "[Alan] Turing's mathematical definition of computability in 1936 gave future computers their name" (Kittler, 1999:243), the word was as ambiguous as "typewriter", which meant "both typing machine and female typist" (ibid:183). The word "analog" was not appended until it was needed to differentiate from digital computers; this new distinction essentially created analog computers (like the ones discussed above) retroactively, all the way back to the abacus and the human hand (and digits) itself.
Turing accurately labeled the product of his research into computability a "Universal Discrete Machine" (ibid:246), as it existed in distinct states while it operated on the data. Kittler's basic description of the operation of a digital computer provides some illustration of this requirement (ibid:243-5; for further information, cf. Patterson and Hennessy, 1998), but all one needs to know is that all data on which a digital computer operates is merely a sequence of numbers that represent a specific point, be it spatial, temporal, or purely numerical. Hence, we come to two definitions for the term "digital data" as given by the Federal Standard Telecommunications glossary (henceforth cited as FS1037C): "1. Data represented by discrete values or conditions ... 2. Discrete representations of quantized values of variables, e.g., the representation of numbers by digits".
In contrast, "analog data" is continuous (i.e. does not exist in discrete states) and provides a model (or "analog") of the quantity being studied: "Data represented by a physical quantity that is considered to be continuously variable and has a magnitude directly proportional to the data or to a suitable function of the data" (FS1037C). In many cases, an artwork may exist both as analog data and as digital data. For example, a photograph exists on film (or the printed photographic paper) as a continuous representation of changes in hue, saturation, and lightness; once the same photograph has been scanned into a computer, however, it is composed of a finite number of data points (or pixels) that represent these values at specific intervals (e.g. 72 per inch for most images produced for the world wide web). To the eye the two pictures may look the same, but they are clearly not structurally identical.
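The difference between the two representations can be sketched numerically. In this minimal illustration (the brightness function and the one-inch strip of film are invented purely for the example), an "analog" quantity is defined at every point, while its digitized counterpart retains only a fixed number of sampled values:

```python
import math

def brightness(x):
    """A continuous 'analog' brightness profile along one inch of film.
    The specific function is invented purely for illustration."""
    return 0.5 + 0.5 * math.sin(2 * math.pi * x)  # varies smoothly over [0.0, 1.0]

# Digitizing: sample the continuous profile at 72 points per inch,
# as with a 72 dpi scan prepared for the web.
samples_per_inch = 72
pixels = [brightness(i / samples_per_inch) for i in range(samples_per_inch)]

# The analog profile has a value at *every* point along the strip;
# the digital version keeps only 72 discrete values, one per pixel.
print(len(pixels))          # → 72 discrete data points
print(round(pixels[0], 3))  # → 0.5, the value at the first sampled position
```

Everything between two sample points is simply discarded, which is precisely why the scan and the film print can look alike without being structurally identical.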
The photograph is Goodman's paradigm of an analog medium (cf. photography). He equates the contrast between analog and digital systems to his distinction between "density" and "differentiation". This he exemplifies with the contrast between an ungraduated and a graduated thermometer: one reads the graduated thermometer as a series of sequential points, but the reading of an ungraduated thermometer is always relational and approximate - finite differentiation is impossible in a dense (or "super-dense") medium (in Mitchell, 1986:67; also cf. reality/hyperreality (2)).
This distinction was already apparent to Wittgenstein. He gives an example of a human learning how to read, and notes the difficulty in identifying the transition from not being able to read to being able to read. The beginning student will be "roughly right" occasionally, he says, and will gradually improve in skill as a continuous process. Therefore, asking the question "Which was the first word that [the student] read?" makes no sense in this context, as there is no specific location of this transference in ability. However, this is not the case with a "reading machine", where the transition between states would be clear with a simple flick of a switch (1968:62-3).
Yet if we extend Frege, we can categorize the argument of analog versus digital as digital itself. When he extends the mathematical notion of a "function" to the idea of a linguistic expression as a "concept", he is essentially positing his philosophical notion of logic as digital. This notion has several tangential relations, not least of which is that the original meanings of both "analog" and "digital" derive from mathematics, and that Goodman's exemplar of digital "differentiation" is text (cf. text; typeprint). The text arguments in Frege's linguistic "concepts" have their real-world analogs (e.g. the word "Germany" refers to the actual country of Germany), yet exist on an atomic level as discrete terms. Each linguistic statement, or "concept", therefore necessarily has a "truth-value" and is either True or False [sic]; hence, "logic is concerned only with the truth of sentences containing the concept" (in Walker, 1965:12-3). This insistence on a binarization of logic is analogous to the 0/1 (or on/off, or true/false) binary dialectic that makes digital computing possible.
The above development of "analog data" versus "digital data" can be further narrowed down in order to discuss signals. An "analog signal" "has a continuous nature rather than a pulsed or discrete nature", while a "digital signal" is "a signal in which discrete steps are used to represent information" (FS1037C). This contrast of analog and digital signals is important if one is to understand recent debates on the quality of digital versus analog recording technologies among audiophiles (see below).
Thomas Edison made the first recording on a tin-foil cylinder phonograph in late 1877 and received a patent early the following year. Phonograph/gramophone recording technology works by directly translating sound waves into continuous physical grooves, either on a disc or cylinder. Electrically-recorded discs (previous machines were mechanical, with the earliest ones requiring a hand-crank) were sold in 1925, bringing with them the possibility for recording large performing groups and synchronizing sound with film. Magnetic tape recording, which simulates the physical grooves of a phonograph record via continuous magnetically alterable particles, was first demonstrated by BASF/AEG in 1935 (Schoenherr), although this development did not reach the United States until the Allied forces captured Radio Luxembourg in 1944, discovering "a new Magnetophone of extraordinary capabilities" (Kittler, 1999:106).
The digital compact disc was introduced in 1980 by Philips/Sony, and the digital audio tape was introduced by the same companies six years later. Digital recordings operate through the concept of sampling the continuous audio source at discrete points - in the case of digital compact discs, 44,100 times per second (44.1 kHz). The data stored at these points are then represented by numerical values, a process called quantization. For this reason, many audio purists argue that digital signals will never rival analog signals in terms of sound reproduction, as analog signals provide a more complete representation of the sound as it physically exists - similar to Goodman's ungraduated thermometer, in which finite differentiation is impossible (and hence, so is representation). Others point out that while it is possible to make an exact copy of the data that comprises a digital recording (cf. mimesis; mirror), analog copies have lower fidelity than their sources (cf. noise); also, analog recordings are more susceptible to degradation of quality over time, as phonograph records can become physically worn out from repeated playing and magnetic tape (which, ironically, includes digital audio tape formats as well) can be harmed by proximity to any strong magnetic source.
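The two steps named above - sampling at 44.1 kHz and quantizing each sample to a numerical value - can be sketched in a few lines of Python. This is only an illustration under simplifying assumptions: the 440 Hz test tone and the 100-sample window are invented for the example, and real CD mastering also involves filtering and error correction that the sketch omits:

```python
import math

SAMPLE_RATE = 44_100   # CD sampling rate: 44,100 samples per second
BIT_DEPTH = 16         # CD quantization: 16 bits per sample

def tone(t, freq=440.0):
    """A continuous 440 Hz sine tone, amplitude in [-1.0, 1.0] (illustrative)."""
    return math.sin(2 * math.pi * freq * t)

# Sampling: take the continuous signal's value only at discrete instants,
# 1/44100th of a second apart.
n_samples = 100  # just the first 100 samples, for brevity
sampled = [tone(i / SAMPLE_RATE) for i in range(n_samples)]

# Quantization: map each sampled value to the nearest of 2**16 discrete levels.
max_level = 2 ** (BIT_DEPTH - 1) - 1  # 32767 for 16-bit signed audio
quantized = [round(s * max_level) for s in sampled]

print(quantized[0])   # → 0, since the sine starts at zero
print(quantized[:5])  # a stream of discrete integers, not a continuous wave
```

The purists' objection lives in the gaps: between any two of these integers, the analog groove or tape still carries signal, while the digital record carries nothing at all.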
Degradation notwithstanding, Hayles's discussion of virtuality and cyborgs suggests that our fetishization of digital technologies is far from over; indeed, that humanity's search to extend its senses is rooted in the idea that "material objects are interpenetrated with informational patterns" (2000:94; also cf. virtuality; cyborg). Uniting digital technologies with the analog body merely expresses a desire for a return to the analog qualities inherent in the digital. This was obvious to Wittgenstein, who realized that humans are searching for ways to relate machines to humans, to equate the mechanical to the physiological: "we want to know how the [machine] is supposed to be like a human being" (1968:114). No wonder, then, that the word "architecture" not only describes Yates's concept of human memory (1966; also cf. memory (2); architecture) but also the "interface between the hardware and the lowest-level software" of a computer (Patterson and Hennessy, 1998:18). "The computer and the brain are functionally compatible", Kittler notes, which merely leaves us with the position of fusing the two (1999:249). As the original medium, memory is both continuous and discrete; it is both analog and digital. We would do well not to forget this.