The Great Conversation:
By Gabriel Blanchard
What does technology, especially with its peculiar relationship to information,
bode for our future?
The lens through which we analyzed technology last week was, largely, material: the Stone Age versus the Bronze Age, mass production versus artisanal craft, human power versus mechanical power. What got left out was something that all technologies have in common. From the simplest piece of chipped flint used as a makeshift knife, all the way to the cell phone you may be reading this on, a device roughly a hundred thousand times more powerful than the computers of the Apollo 11 program—all of it is built, in part, from a resource not made of any material at all: ideas.
Not only are all technologies the brainchild of minds, but some of them are primarily concerned with mental things as well, sometimes relating to the material world rather tenuously. Arguably the first and finest of all human technologies, language, is sadly far too complex to address here! (But we have discussed the topic here at the Journal more than once, including a three-part series on the study of signs and symbols, or semiotics.)
We can start with something in between matter and idea, like time. However confusing Einstein made it sound, time is a real thing, just like space is. Of course, most creatures don’t bother to measure it, but humans do. In fact, we like measuring time so much we decided to do it twice (which suggests that really, we should have seen Einstein coming). First, we measure the lapse between a given point on the earth’s surface lying on a straight line between the axis of the earth and the center of the sun, and the next time that same point lies on that same line: in other words, we measure the cycle of day and night. The clock is our main piece of technology used for this purpose. It also divides the cycle into sub-units, which we call hours, and which, when you come to think of them, are exceedingly odd. Why twelve hours in a day,* rather than ten? Or four? Or five? Why sixty minutes in an hour? And so on. As it happens, the sexagesimal system (counting based on the number sixty, as opposed for instance to the decimal system we use in English, based on the number ten) does have some surprising advantages: sixty is the smallest number that can be divided into halves, thirds, quarters, fifths, sixths, tenths, and twelfths. Such versatility can really come in handy; ask anyone who’s tried to serve a pie for dessert.
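A reader who wants to verify that claim about sixty can do so in a couple of lines of Python (a throwaway check, not part of the essay’s argument; `math.lcm` takes multiple arguments as of Python 3.9):

```python
from math import lcm

# Sixty should be the least common multiple of the divisors named above:
# halves, thirds, quarters, fifths, sixths, tenths, and twelfths.
parts = [2, 3, 4, 5, 6, 10, 12]
print(lcm(*parts))  # 60 -- the smallest number every one of them divides evenly
```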
Coincidentally, the sexagesimal system also turned out to be useful for our other favorite way of measuring time: the lapse between when the sun rises at the southern extreme of its slow back-and-forth along the horizon, and the next time it rises at that same extreme (popularly called a year). The number of day-night cycles contained in that period comes to an almost perfect six sets of sixty, with a remainder of five-and-a-bit. Or, if we wanted to, we might think of it as twelve sets of thirty; and there were reasons to do this. For one thing, many ancient societies decided not to measure time just by looking directly at the sun; they often looked instead at the moon, whose cycle from full to full runs to twenty-nine and a half days. Call that alternating “moonths” of twenty-nine and thirty days, and voilà, you have yourself a lunar year that’s eleven or twelve days short of a solar year—more than a third of a “moonth.” So you get an intricate little dance between moon-time and sun-time, complex enough to be interesting but also logical enough to be useful; all you had to do was throw in a “leap month” every two or three years, to compensate for the eleven or twelve days of disagreement—which is the model still used for the Hebrew calendar.
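For those inclined to check the arithmetic of that intricate little dance, here is a short Python sketch using the round figures from the paragraph above (a 29.5-day lunation and a 365.25-day solar year; these are the essay’s approximations, not precise astronomical values):

```python
# Round figures from the text, not precise astronomical values.
solar_year = 365.25         # day-night cycles per solar year
lunation = 29.5             # days from one full moon to the next
lunar_year = 12 * lunation  # twelve alternating 29- and 30-day "moonths"

print(lunar_year)               # 354.0
print(solar_year - lunar_year)  # 11.25 -- the "eleven or twelve days of disagreement"

# Seven 30-day leap months spread over nineteen years (the classical
# Metonic cycle, the scheme behind the Hebrew calendar) nearly close the gap:
drift = 19 * solar_year - (19 * lunar_year + 7 * 30)
print(drift)  # 3.75 -- only a few days' drift across nineteen years
```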
There may be another reason this piece of technology developed in twelve-fold form. In the Fertile Crescent and most of the Mediterranean, the four seasons we’re familiar with are normal (winter, spring, summer, and autumn). However, two big civilizations with only three seasons were sitting close by: India with its monsoon climate, and Egypt with its three phases of flooding of the Nile. It may have been convenient, especially for messengers and merchants, to have a calendar that converted easily between three seasons and four; twelve months split evenly either way, into four seasons of three months or three seasons of four!
As the preceding few paragraphs suggest, telling time was one of the tasks assigned to astronomy; and, while harder before the invention of the telescope, astronomy was in other ways easier in the ancient world, before the invention of electric lights. Perhaps we should not be entirely shocked that the first forerunners of computers go back before electricity was harnessed. To pick out just one example, the astrolabe—something like a handheld model of the solar system—could be used to tell time, deduce one’s latitude on the earth, determine the elevation of any celestial body (even if the earth were in the way!), and more. It was one of the key technologies of the Age of Discovery, especially for navigators; they inherited it from the Medievals. The Persian philosopher and mathematician Abu Rayhan al-Biruni wrote a famous treatise on the astrolabe in the early eleventh century, and he had probably gotten it one way or another from Byzantine sources. Hypatia, a famous philosopher of the late fourth century, was sometimes credited with inventing the astrolabe, but it had already been in use for centuries when she was born; in fact, the sophisticated, long-mysterious Antikythera Mechanism is thought to be a kind of astrolabe, or alternately, “an analogue computer.” It dates to around the second century before Christ, and its complexity implies that it must have stood at the end of a long line of developments.
The same line in the other direction leads, as we know already, toward digital technology—but before we pursue it, we should pause for a moment to discuss some problems with technology. The most basic problem with it, of course, is that it sometimes breaks, though this is not so much a problem with technology as a problem with living under the second law of thermodynamics. More serious problems come in with things like fuel and waste. Most kinds of fuel come in finite amounts; there are some (like solar energy) that are for practical purposes unlimited, but we either have not yet been able to harness them as well as other fuels, or at least have not bothered to do so. Petroleum, the substance from which gasoline is made, is the current favorite fuel worldwide. But petroleum is a decidedly finite resource—even a country that managed to corner the planet’s entire supply would eventually run out—and also creates dangerous, toxic waste products. That is besides the petroleum itself being dangerous and toxic if it gets into water, soil, or the atmosphere; the canny reader will note that these cover “places in which things can be” almost exhaustively.
How exactly humans ought to relate to our machines also raises problems. The discussion of what rights, if any, artificial intelligences ought to enjoy has by now become a tired trope in science fiction. However, more curious puzzles persist. There is a sort of dark inversion of the artificial intelligence in the thought experiment of the philosophical zombie: a creature sophisticated enough to behave, in all respects and at all times, like an intelligent being, but which has no subjective experience, no consciousness at all.
Consider too the issues posed by the question of employment and automation. If humanity ever reached the point where none of us had to work, would we still insist—for no apparent reason—that people “earn their keep”? If robots take care of all necessities, is there any meaningful sense in which my neighbor, who lets a robot mow his lawn, is somehow “not pulling his weight,” whereas I, who mow my lawn (and thus arguably waste my time and energy), am pulling mine? And if freedom from work were implemented gradually, who is better off: the employed, or the idle?
But computers have long since come up: technology as we normally use the word, the welcoming of our robot overlords, and so forth. Every third week or so, it seems, we get another voice crying in the wilderness something to the effect of “The Youths Are in Crisis! Is It Phones?” The articles may be silly, but they suggest a persistent feeling that something has gone wrong with the way we relate to technology. The feeling appears to be at least as old as Frankenstein, and takes many forms. Social media in particular is often blamed for “what it is doing to our society,” meaning everything from casual rudeness to disinformation and propaganda campaigns. Yet this does not place the blame where it truly belongs, does it? Social media, if left to (or more literally on) its own devices, just sits there. We are the ones doing something to society. Technologies of every kind have changed the circumstances and results of our behavior, and that merits attention; but our behavior remains ours to fix, or, of course, not. Perhaps no one recognized the vertigo-inducing responsibility that technology represents more clearly than Oppenheimer, when he quoted the warlike words of the Bhagavad Gītā about the atomic bomb: “Now I am become Death, the destroyer of worlds.”
Aldous Huxley, Brave New World
Isaac Asimov, I, Robot
Umberto Eco, The Name of the Rose
Christopher Tolkien and Humphrey Carpenter (eds.), The Letters of J. R. R. Tolkien
Eric Hobsbawm, The Age of Extremes: The Short Twentieth Century, 1914–1991
Tim Wu, The Attention Merchants
*And for that matter, why have hours that are always equal in length? In some ancient methods of timekeeping, one hour was not “sixty minutes” but “one-twelfth of a day,” and therefore varied with seasonal variation in day length.
Gabriel Blanchard is a non-artificial intelligence who serves as CLT’s editor at large. He is a proud uncle of seven human nephews, and lives in Baltimore, Earth.
Thank you for reading the Journal. If you enjoyed this piece, you might like some of our other introductions to the great ideas; check out our short series on magic or our slightly longer one on life and death, or our standalone posts on experience, quantity, and revolution. You might also like our profiles of Albert Camus, St. Bede, Friedrich Nietzsche, and other figures from our Author Bank—a group who are also the focus of our ongoing seminar series with academics from all over the country.
Published on 27th April, 2023. Author image taken from the front page of William Blake’s The Book of Urizen (1818), depicting the title character. Urizen was a character of Blake’s own creation, owing a good deal to the Demiurge of the Gnostics; he embodied law and reason, but fell into narcissism and the attempt to dominate others, fomenting oppression, wrath, and jealousy.