Technology

When used in everyday speech today, the keyword “technology” refers primarily to physical devices. Yet this usage was not common until the second half of the twentieth century. During the seventeenth century, “technology” was either a systematic study of the arts or the specific terminology of an art (Casaubon 1612; Bentham 1827; Carlyle 1858). An encyclopedia, dictionary, or publication like Keywords for American Cultural Studies would have been called a technology. Related terms such as “tool,” “instrument,” and “machine” described physical devices (Sutherland 1717; Hanway 1753). In the nineteenth century, “technology” became the practical application of science, a system of methods to execute knowledge (Horne 1825; Raymond Williams 1976/1983), or a discipline of the “Industrial Arts” focused on the use of hand and power tools to fabricate objects (G. Wilson 1855; Burton 1864). During the twentieth century, the meaning of “technology” gradually expanded to include both the processes of a system and the physical devices required of that system (D. F. Noble 1977). By midcentury, it was used as a modifier to characterize socioeconomic developments, as in the use of “high-technology” or “high-tech” to describe complex applications of specialized machines in industrialized economies.

Scholars of American studies and cultural studies working on the history of technology have emphasized its social, cultural, and economic dimensions. They have tended to resist complicity in technological determinism (technology as the sole cause of cultural change), technological instrumentalism (technology as value neutral), technological positivism (technological progress as social progress), and technological essentialism (technology as having some intrinsic nature or essence). In fact, American studies and cultural studies approaches to technology are best described as “nonessentialist.” The central premise of nonessentialism is that neither technologies nor histories of technology can be divorced from the social and cultural contexts of their production, circulation, or consumption (Ross 1990). American studies and cultural studies approaches begin with the claim that technologies can be made, interpreted, and used in multiple and often contradictory ways (Ihde 1990; Feenberg 1999; Haraway 1985). They share with “constructivist” approaches common to both fields a focus on the ways in which social conditions and meanings shape how people create, perceive, and understand technologies. But they also underscore why the technical particulars of technologies—how technologies turn this into that (Fuller 2005)—really matter (Galloway 2006; Gitelman 2006; Bogost 2007; Kirschenbaum 2008). They frequently note that a technology can articulate complex relations between actors in a given network, rendering decisions for them beyond their own knowledge or awareness (Latour 1987; Kittler 1999; Galloway 2004; Chun 2011). From a nonessentialist perspective, technologies are never simply “extensions” of human beings or human rationality (McLuhan 1964/2003). Instead, technologies exist in recursive and embodied relationships with their operators, and they must be understood through their social, cultural, economic, and technical processes, all of which are material.

In order to better understand this approach, consider a key moment in the history of technology: the Luddite rebellions that started in Nottingham, England, in 1811. Composed largely of experienced artisans in the hosiery and lace trades, the Luddites broke wide-frame looms—a new technology of the moment—because looms threatened their livelihood by automating their craft and reducing the costs of hosiery and lace production. The rebellions spread beyond Nottingham (to Derby, Yorkshire, and elsewhere) and to other industries (cotton, cropping, and wool). They ultimately failed to stop the proliferation of wide-frame looms, and their legitimacy was undermined by the Luddites’ violent attacks on magistrates, merchants, and other townspeople. Yet the rebellions are historically important because the Luddites anticipated the gradual shift from technology as “the theory and accurate description of useful arts and manufactures” (Zimmerman 1787, iii) to technology as the material application of science in industries such as textile manufacturing. To adapt a metaphor from Karl Marx (1867/1976), the Luddites understood how technology was becoming “frozen labor” or, put differently, “work and its values embedded and inscribed in transportable form” (Bowker and Star 1999, 135).

A nonessentialist approach to technologies such as wide-frame looms suggests that machines were an important factor in the shift toward “frozen labor” during the nineteenth century, but they were not its sole cause. Instead, machines represented and even enabled the social, cultural, and economic forces of industrial capitalism: the rise of factories (L. Klein 2008); the alienation, systemization, and automation of handicraft; the widespread investment in efficiency; and the decrease of human error through scientific management and standardized workflows (F. Taylor 1911/2010). Nonessentialist approaches also recognize how the implications of technology are interpreted differently across different settings and populations. For working-class Luddites, the wide-frame loom implied the deskilling of certain crafts and the eventual obsolescence of existing occupations; for engineers such as Charles Babbage (1832), it pointed toward innovation, heightened productivity, decreased costs, and increased accuracy in manufacturing. Such differing perspectives reproduced asymmetrical relations of class and power.

These class and power differences are important to remember when observing how industrialization corresponded with the formation of technology as an academic discipline during the mid-nineteenth century. At that time, the word began to appear in university names, such as the Massachusetts Institute of Technology, which opened in 1865. As a discipline, technology was associated with the humble and economically useful “Industrial Arts,” rather than the noble and aesthetically useful “Fine Arts” (G. Wilson 1855). It was also a set of technical skills possessed by an individual: “His technology consists of weaving, cutting canoes, [and] making rude weapons” (Burton 1864, 437). In many universities, such skills were deemed inferior to the mental labor of science and literature. During debates with biologist T. H. Huxley, the nineteenth-century poet and critic Matthew Arnold defined technology as mere “instrument-knowledge” (1882/1885, 107), peripheral to culture and the civilizing pursuits of spiritual and intellectual life (Mactavish and Rockwell 2006). Although Huxley and Arnold disagreed about the role that science should play in education, neither considered technology a discipline worthy of the ideal university. Weaving, cutting canoes, and making rude weapons were routines delegated to the working class, not the late nineteenth century’s educated elite.

The nineteenth-century definition of “technology” as a practical application of science persisted well into the twentieth century, especially through the proliferation of phonography, photography, cinema, radio, and other utilitarian modes of mechanical reproduction (T. Armstrong 1998). The effects of this proliferation were perceived variously across contexts, but a common question during the first half of the twentieth century was how—through technology—politics were aestheticized and aesthetics were politicized (Benjamin 1936/1968). The totalitarian regimes of fascism and Nazism aestheticized their politics through references to technological innovation. They rendered automobiles, airplanes, cameras, radios, and typewriters beautiful objects: symbols of progress, modernity, efficiency, and mastery over nature (Marinetti 1909/2006; Triumph of the Will 1935). Once aestheticized, technologies such as cinema helped mask totalitarian violence through commodity culture and mass distribution, prompting the Frankfurt school philosopher Herbert Marcuse to write, “the established technology has become an instrument of destructive politics” (1964/2002, 232).

Like the Luddites, Marcuse and other neo-Marxists were critical of the tendency to reify politics and labor through technologies and aesthetics (Horkheimer and Adorno 1944/2002; Dyer-Witheford 1999). Their response required the politicization of aesthetics through the same modes of mechanical reproduction. For instance, early cinema was used for purposes other than formalizing and disseminating totalitarian ideology. It also fostered opportunities for shared experience (in the theater), collective witnessing (of narratives, images, and audio), and better understanding of how consciousness, perception, and social relations are produced in the first place (Benjamin 1936/1968; Kracauer 1960/1997; Hansen 2011). This response prevented technology from being reduced to an instrument or agent of positivism. Rather, it positioned technology as one element in a complex system of material processes and conditions. The more practical this system appears, the more instrumental, determinist, or positivist it becomes (Postman 1993). In this sense, “practical” is nearly synonymous with a “natural,” “intuitive,” or “invisible” technology (Heidegger 1977/1993; Weiser 1991; Norman 1998).

This common affiliation of technology with practicality explains why nonessentialist approaches are central to American studies and cultural studies: they resist the tendency either to give technologies too much authority in everyday life or to relegate people to the role of unconscious consumers, who are incapable of intervening in systems, applications, or devices of any sort (Braverman 1974/1998; D. F. Noble 1995). They also highlight the fact that technology becomes gendered, sexualized, and racialized through its naturalization or routinization. Historically, technology has been culturally coded as masculine (Wajcman 1991; Balsamo 1996; Rodgers 2010), and it has consistently served the interests of “able” bodies, prototypical whiteness, and heteropatriarchy (Haraway 1985; A. Stone 1996; Nakamura 2002, 2008; Sterne 2003; T. Foster 2005; E. Chang 2008; Browne 2010). Yet it is important to recognize that bias or supremacy is not somehow inherent to technologies or their technical particulars. It emerges from the social, cultural, and economic conditions through which technologies are articulated with interpretive processes and embodied behaviors.

In response to this recognition, some American studies and cultural studies practitioners encourage a “technoliteracy” influenced by computer hacking, technical competencies, new media production, and critical making (Wark 2004; Hertz 2009; Ratto 2011; Losh 2012; McPherson 2012a). Andrew Ross (1990) defines “technoliteracy” as “a hacker’s knowledge, capable of reskilling, and therefore of rewriting the cultural programs and reprogramming the social values that make room for new technologies” (para. 43). Technoliteracy thus complicates Matthew Arnold’s reduction of technology to mere instrument-knowledge, since it refuses to draw a neat division between physical devices and social values. More important, it involves actively intervening in technologies—at the level of systems, applications, and devices—as key ingredients in the everyday production of knowledge and culture. Thus, the question for nonessentialist investments in technoliteracy is: technology, but for whom, by whom, under what assumptions, and to what effects?

In our so-called digital age, many people would assume that interventions in technological processes are accessible to more people than ever before. After all, the Internet has been depicted as a decentralized, democratizing, and even immaterial “cyberspace” of radical freedom—a hacker’s paradise of do-it-yourself coding, performance, and publication (Gibson 1982; Barlow 1996/2001; Hayles 1999). The trouble is that proliferation should not be conflated with access or intervention. As the very word “technology” is subsumed by industry terms such as “iPad,” “Twitter,” “Droid,” and “Facebook,” not to mention the ubiquity of verbs such as “Bing,” “Skype,” and “Google” (Vaidhyanathan 2011), the values, procedures, and biases of high-technology systems, applications, and devices grow increasingly opaque or invisible to most people, who are simply deemed “users.” On the one hand, strategies for social control and regulation persist and expand through code, algorithms, metrics, protocols, and networks, which—when compiled—exceed the knowledge base of any given individual or group (Galloway 2004; Beller 2006; Chun 2006, 2011). On the other hand, scholars and users of technology are reimagining the implications of technology and technoliteracy, especially through collaboration, experimental media, and social justice initiatives (Daniel and Loyer 2007; Juhasz 2011; Anthropy 2012; Cárdenas 2012; Goldberg and Marciano 2012; Women Who Rock 2012; Cong-Huyen 2013; Lothian and Phillips 2013).

Collaborative work around technologies allows practitioners to build alternative infrastructures, tools, and projects that are difficult (if not impossible) to construct alone (Davidson 2008; Sayers 2011). Meanwhile, experimental media afford multimodal approaches to scholarly, cultural, and creative expression, anchored not only in text but also in video, audio, images, programming, and dynamic visualizations (McPherson 2009). Such expression is central to many social justice initiatives that rely on witnessing, interviews, process documentation, real-time data, intercultural dialogue, and community-based participatory action research (Ang and Pothen 2009). When blended together, collaboration, experimental media, and social justice research suggest an exciting new trajectory for American studies and cultural studies, one that invites practitioners to engage the history and future of technologies at the intersection of thinking and doing, critiquing and making, immersion and self-reflexivity.
