In the twenty-first century, we tend to associate the word “digital” with computation, but its origins hark back to ancient times. The term derives from digitus in classical Latin, meaning “finger,” and, later, from digit, which refers both to whole numbers less than ten and to fingers or toes. Digital procedures long predate the development of electronic computers, and we might understand a number of earlier devices or systems to operate by digital principles. For instance, the abacus is a simple digital calculator dating from 300 BC, while Morse code and Braille represent more recent digital practices. What each of these examples has in common—from fingers to digital computers—is a particular use of the digital to refer to discrete elements or to separate numbers. This focus on the discrete and the separate is central to the functioning of today’s digital electronics, which, at a basic level, operate by distinguishing between two values, zero and one. While the digital predates computation, today the two terms are closely linked, and the adjective “digital” is typically a shorthand for the binary systems that underpin computation. Thus, we are living through a “digital revolution,” are at risk of an increasing “digital divide,” and are...

This essay may be found on page 91 of the printed volume.
