At its peak, ancient Rome enjoyed a variety of comforts that, once lost, would take centuries to recover. This process, of course, constitutes much of the story of Western civilization. Though some knowledge didn’t survive in any useful form, some of it remained lastingly embodied. The mighty ruins of Roman aqueducts, for example, continued to stand all across the former Empire. Together they once constituted a vast water-delivery system whose construction and operation took humanity quite some time to understand again. Today, you can learn about both in the video from ancient-history YouTuber Garrett Ryan just above.
“Greek engineers began building aqueducts as early as the sixth century BC,” says Ryan. “A stone-lined channel carried spring water to archaic Athens, and Samos was served by an aqueduct that plunged through a tunnel more than one kilometer long.”
These systems developed throughout the Hellenistic era, and their Roman successors made use of “arches and hydraulic concrete, but above all it was the sheer number and scale that set them apart.” Most Roman cities had “networks of wells and cisterns” to supply drinking water; aqueducts, in large part, came as “luxuries, designed to supply baths, ornate fountains, and the houses of the élite.” Man’s taste for luxury has inspired no few of his great works.
The task of building Rome’s aqueducts was, in essence, the task of building “an artificial river flowing downhill from source to city” — over great distances using no power but gravity, and thus on a descending slope of about five to ten feet per mile. This precision engineering was made possible by the use of tools like the dioptra and chorobates, as well as an enormous amount of manpower. Roman aqueducts ran mostly underground, but their most impressive stretches are the elevated channels that have become landmarks today. “The most spectacular example is undoubtedly the Pont du Gard, located just outside Nîmes,” says Ryan, and TV traveler Rick Steves visits it in the clip above. What once served as infrastructure for the well-watered mansions of the wealthy and connected now makes for a fine picnicking spot.
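To make that gradient figure concrete, here is a quick back-of-the-envelope conversion (not from the video; the 50 km channel length is chosen purely for illustration):

```python
FT = 0.3048          # meters per foot
MILE = 1609.344      # meters per mile

def grade(ft_per_mile: float) -> float:
    """Slope as a dimensionless ratio (vertical drop / horizontal run)."""
    return ft_per_mile * FT / MILE

for slope in (5, 10):
    g = grade(slope)
    print(f"{slope} ft/mile = {g:.4%} grade, {g * 1000:.2f} m of drop per km")

# Total drop over a hypothetical 50 km channel:
print(f"drop over 50 km at 5 ft/mile: {grade(5) * 50_000:.1f} m")
```

A grade of roughly 0.1 to 0.2 percent means the surveyors had to hold the channel to about a meter or two of descent per kilometer over its entire course, which is what made instruments like the dioptra and chorobates indispensable.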
Based in Seoul, Colin Marshall writes and broadcasts on cities, language, and culture. His projects include the Substack newsletter Books on Cities, the book The Stateless City: a Walk through 21st-Century Los Angeles and the video series The City in Cinema. Follow him on Twitter at @colinmarshall or on Facebook.
Attempts to broadcast color TV wouldn’t be made until the 1950s, with the first commercial broadcast made by CBS airing in 1951 on five stations. Hardly anyone could see it. When NBC broadcast the Tournament of Roses Parade in 1954, fewer than 8,500 American households owned a color TV set. By April 1961, an editorial in Television magazine argued that color “is still in the egg, and only skillful and expensive handling will get it out of the egg and on its feet.” Needless to say, the adoption of the new technology was exceedingly slow.
Ratings wars and advertising wars forced color to come of age in the mid-60s, and as a result “color TV transformed the way Americans saw the world,” writes historian Susan Murray at Smithsonian, as well as the way “the world saw America.” Color television “was, in fact, often discussed by its proponents as an ideal form of American postwar consumer vision: a way of seeing the world (and all of its brightly hued goods) in a spectacular form of ‘living color.’” Color was explicitly talked up as spectacle, though sold to consumers as a truer representation of reality.
“Network executives pitched [color TV] to advertisers as a unique medium that would inspire attentiveness and emotional engagement,” writes Murray, “making [viewers] more likely to purchase advertised products, a growing myriad of consumer goods and appliances that were now available in a wider set of vibrant colors like turquoise and pink flamingo.” (Thanks, of course, to the advent of space-age polymers.) Such history provides us with more context for the puzzlement of newsman Bob Bruner in 1967 (above), introducing viewers to Iowa’s Channel 2 switch-over to color.
“I feel doubly honored to have been chosen to be the first one involved in our big change,” says Bruner after chatting with station manager Doug Grant, “because there are so many much more colorful characters around here than this reporter in the news.” That year, there were characters like Pink Floyd appearing for the first time on American Bandstand (see that footage colorized here), their psychedelic vibrancy muted in monochrome.
Bruner had already been upstaged nearly ten years earlier, when NBC’s WRC-TV in Washington, DC introduced its first color broadcast with President Dwight D. Eisenhower, who extolls the virtues of the medium above, in the oldest surviving color videotape recording. Even so, only around 25% of American households owned a color TV in 1967. It would be another decade before every American household (or every “consumer household”) had one, and not until the mid-80s until the medium reached full saturation around the globe.
In recent decades, a medieval Persian word has come to prominence in English and other major world languages. Many of us use it on a daily basis, often while regarding the concept to which it refers as essentially mysterious. The word is algorithm, whose roots go back to the ninth century in modern-day Greater Iran. There lived a polymath by the name of Muhammad ibn Musa al-Khwarizmi, whom we now remember for his achievements in geography, astronomy, and mathematics. In that last field, he was the first to define the principles of “reducing” and “balancing” equations, a subject all of us came to know in school as algebra (a name itself descended from the Arabic al-jabr, or “completion”).
Today, a good few of us have come to resent algorithms even more than algebra. This is perhaps because algorithms are most popularly associated with the deep, unseen workings of the internet, a system with ever increasing influence over the things we do, the information we receive, and even the people with whom we associate.
Provided sufficient data about us and the lives we lead, so we’re given to understand, these algorithms can make better decisions for us than we can make for ourselves. But what exactly are they? You can get one answer from “Why Algorithms Are Called Algorithms,” the BBC Ideas video at the top of the post.
For Western civilization, al-Khwarizmi’s most important book was Concerning the Hindu Art of Reckoning, which was translated into Latin three centuries after its composition. Al-Khwarizmi’s Latinized name “Algoritmi” gave rise to the word algorismus, which at first referred to the decimal number system and much later came to mean “a set of step-by-step rules for solving a problem.” It was Enigma codebreaker Alan Turing who “worked out how, in theory, a machine could follow algorithmic instructions and solve complex mathematics. This was the birth of the computer age.” Now, much further into the computer age, algorithms “are helping us to get from A to B, driving internet searches, making recommendations of things for us to buy, watch, or share.”
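The video stays at the level of definition, but the classic concrete illustration of “a set of step-by-step rules for solving a problem” is Euclid’s algorithm for the greatest common divisor, which fits the bill in a few lines (an example of ours, not drawn from the BBC piece):

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeat one simple rule until it terminates."""
    while b != 0:          # rule: while the remainder is nonzero...
        a, b = b, a % b    # ...replace (a, b) with (b, a mod b)
    return a

print(gcd(1071, 462))  # -> 21
```

Every step is mechanical and unambiguous, which is exactly the property Turing seized on: a machine needs no understanding, only the ability to follow the rules.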
The algorithm giveth, but the algorithm also taketh away — or so it sometimes feels as we make our way deeper into the twenty-first century. In the other BBC Ideas video just above, Jon Stroud makes an investigation into both the nature and the current uses of this mathematical concept. The essential job of an algorithm, as the experts explain to him, is that of processing data, these days often in large quantities and of various kinds, and increasingly with the aid of sophisticated machine-learning processes. In making or influencing choices humans would once have handled themselves, algorithms do present a risk of “de-skilling” as we come to rely on their services. We all occasionally feel gratitude for the blessings those services send our way, just as we all occasionally blame them for our dissatisfactions — making the algorithm, in other words, into a thoroughly modern deity.
Much in Ukraine has been lost since the Russian invasion commenced this past February. But efforts to minimize the damage have been mounted on all fronts, and not just geographical ones. The preservation of Ukrainian culture has become the top priority for some groups, in response to Russian forces’ seeming intent to destroy it. “Cultural heritage is not only impacted, but in many ways it’s implicated in and central to armed conflict,” says Hayden Bassett, director of the Virginia Museum of Natural History’s Cultural Heritage Monitoring Lab, in the Vox explainer above. “These are things that people point to that are unifying factors for their society. They are tangible reflections of their society.”
This very quality made them a sadly appealing target for Russian attacks. As the video’s narrator puts it, Vladimir Putin “has made it clear that identity is at the ideological center of Russia’s invasion,” ostensibly an effort to reunify two lands of a common civilization. For Ukraine, the strategy to protect its own cultural heritage during wartime involves two phases of work.
Step two is to secure these cultural treasures, whether they be paintings, sculptures, buildings, or anything else besides. This requires the collaboration of “government agencies, militaries, NGOs, academics, museum institutions,” says Bassett, as well as of volunteers on the ground physically safeguarding the artifacts. This often involves hiding them whenever possible, and “if history is any indication,” says the narrator, “collections have moved underground or outside of major cities, or outside the country entirely.” So it was in Europe under the marauding of Nazi Germany, including, as seen in the France 24 segment above, with holdings of the Louvre up to and including the Mona Lisa. The state of world geopolitics today may have us wondering if we’ve truly learned the lessons of the Second World War, but at least the fight to save Ukrainian culture reminds us that we haven’t forgotten them all.
You remember it — one of the most heartbreaking scenes on TV. A man longs for nothing more than time to read, to be free of all those people Sartre told us make our hells. Finally granted his wish by the H‑Bomb, he then accidentally breaks his glasses, rendering himself unable to make out a word. Oh, cruel irony! Not an optometrist or optician in sight! Surely, there are “Time Enough at Last” jokes at eye care conventions worldwide.
Morality tales wrapped in science fiction might make us think about all sorts of things, but one of the most obvious questions when we witness the fate of Mr. Henry Bemis, “charter member in the fraternity of dreamers,” might be, but what did people do before corrective lenses? Were millions forced to accept his fate, living out their lives with farsightedness, nearsightedness, and other defects that impede vision? How did early humans survive in times much less hospitable to disabilities? At least there were others to read and describe things for them…
In truth, the Twilight Zone is not far off the mark. Or at least nearsightedness and reading are closely linked. “As long as primates have been around, there’s probably been myopia,” says professor of ophthalmology Ivan Schwab. But Schwab argues in his book Evolution’s Witness: How Eyes Evolved that the rise of reading likely caused skyrocketing rates of myopia over the past three hundred years. “Though genes and nutrition may play a role in nearsightedness,” Natalie Jacewicz writes at NPR, “[Schwab] says education and myopia seem to be linked, suggesting that when people do a lot of close work, their eyes grow longer.”
As the History Dose video above explains, the oldest image of a pair of glasses dates from a 1351 painting of Cardinal Hugh of Saint-Cher. The painting is an anachronism — spectacles, the narrator tells us, were invented in Pisa around 1286, after the cardinal’s death. They “gradually spread across Europe and travelled the Silk Road to China.” (The oldest surviving pair of glasses dates from around 1475.) So what happened before 1286? As you’ll learn, glasses were not the only way to enlarge small items. In fact, humans have been using some form of magnifying lens to read small print (or manuscript or cuneiform or what-have-you) for thousands of years. Those lenses, however, corrected presbyopia, or far-sightedness.
Those with myopia were mostly out of luck until the invention of sophisticated lens-grinding techniques and improved vision tests. But for most of human history, unless you were a sailor or a soldier, you “likely spent your day as an artisan, smith, or farm worker,” occupations where distance vision didn’t matter as much. In fact, artisans like medieval scribes and illuminators, says Neil Handley — museum curator of the College of Optometrists, London — were “actually encouraged to remain in their myopic condition, because it was actually ideal for them doing this job.”
It wasn’t until well after the time of Gutenberg that wearing lenses on one’s face became a thing — and hardly a popular thing at first. Early glasses were held up to the eyes, not worn. They were heavy, thick, and fragile. In the 15th century, “because… they were unusual and rare,” says Handley, “they were seen as having magical powers” and their wearers viewed as “in league with the devil, immoral.” That stigma went away, even if glasses picked up other associations that sometimes made their users the subject of taunts. But by the nineteenth century, glasses were common around the world.
Given that we all spend most of our time interacting with small text and images on handheld screens, it seems glasses still haven’t spread widely enough. “More than a billion, and maybe as many as 2.5 billion, people in the world need but don’t have glasses to correct for various vision impairments,” notes Livescience, citing figures from The New York Times. For many people, especially in the developing world, the question of how to get by in the world without eyeglasses is still a very pressing, present-day issue.
When I lived in Los Angeles, I enjoyed no breakfast spot more than Pann’s. The place had it all: not just signature plates ranging from biscuits and gravy to chicken and waffles, but tropical landscaping, stone walls, a slanted roof, banquettes in burgundy and counter seats in cream, and as the pièce de résistance, a neon sign that lit up one letter at a time. Built in 1958, Pann’s stands today as quite possibly the most immaculate surviving example of Googie, a mid-twentieth-century aesthetic that takes its name from another Los Angeles coffee shop opened nearly a decade earlier. Though designed by no less serious a modern architect than Frank Lloyd Wright protégé John Lautner, Googie’s gave rise to perhaps the least serious of all architectural movements.
“It’s a style built on exaggeration; on dramatic angles; on plastic and steel and neon and wide-eyed technological optimism,” writes Matt Novak at Smithsonian magazine. “It draws inspiration from Space Age ideals and rocketship dreams. We find Googie at the 1964 New York World’s Fair, the Space Needle in Seattle, the mid-century design of Disneyland’s Tomorrowland, in Arthur Radebaugh‘s postwar illustrations, and in countless coffee shops and motels across the U.S.”
But the acknowledged cradle of Googie is Los Angeles, whose explosive development alongside that of mid-twentieth-century American “car culture” encouraged the ultra-commercial architectural experimentation whose first priority was to catch the eye of the motorist — and ideally, the hungry motorist.
You can hear the history of Googie told in the Cheddar Explain video “How Los Angeles Got Its Iconic Architecture Style,” which adapts Novak’s Smithsonian piece. In “Googie Architecture: From Diners to Donuts,” photographer Ashok Sinha goes into more detail about how the style turned “architecture into a form of advertising.” Like all the most effective advertising, Googie drew from the zeitgeist, incorporating the striking shapes and advanced materials connected in the public mind with notions of speed and technology embodied not just by automobiles but even more so by rockets. For Googie was the architecture of the Space Race: it’s no accident that the creators of The Jetsons, which aired in 1962 and 1963, rendered all the show’s settings in the same style.
It could fairly be said that no one architect invented Googie, that it emerged almost spontaneously as a product of American popular culture. But “for some reason, we got stuck with the name,” says architect Victor Newlove, of Armet Davis Newlove and Associates, in the interview clip above. For good reason, perhaps: to that firm’s credit are several locations of the diner chains Bob’s Big Boy (where for years David Lynch took his daily milkshake) and Norms, both of which are still in business in Los Angeles today. Its architects Eldon Davis and Helen Liu Fong also designed Pann’s, which for many Googie enthusiasts remains an unsurpassable achievement — and one whose competition, since the moon landing and the end it put to not just the Space Race but the sensibility it inspired, has been dwindling one demolition at a time.
Prince left us a vast body of work, with much rumored still to be awaiting release in his vault. But among his many albums already available, I still hold in especially high regard For You, the debut he recorded while still a teenager. Not only did he put out this first LP at an unusually young age, he produced it and played nearly all its instruments. Though Prince seemed to have emerged into the world as a fully formed pop-music genius, he had to come from somewhere. Indeed, he came from Minneapolis, a city with which he remained associated all his life. Now, nearly six years after his death, a Minneapolis television station has discovered a previously unknown artifact of the Purple One’s adolescence.
In April 1970 the teachers of Minneapolis’ public schools went on strike, and a reporter on the scene asked a crowd of nearby schoolchildren whether they were in favor of the picketing. “Yup,” replies a particularly small one who’d been jumping to catch the camera’s attention. “I think they should get a better education, too.”
Not only that, “they should get some more money ’cause they be workin’ extra hours for us and all that stuff.” None of this was audible to the producer at WCCO TV, a Minneapolis-native Prince fan, who’d brought the half-century-old footage out of the archive in order to contextualize another teachers strike just last month. But in the young interviewee’s face and mannerisms he saw not just a local boy, but one particular local boy made enormously good.
No one who’s seen Prince in action early in his career could fail to recognize him in this long-unseen footage. But it took more than fans to confirm his identity, as you can see in the WCCO news broadcast and behind-the-scenes segment here. A local Prince historian could provide highly similar photographs of the star-to-be in the same year, when he would have been eleven. Eventually the investigation turned up a childhood neighbor and former bandmate named Terry Jackson, who watches the clip and breaks at once into laughter and tears of recognition. “That’s Skipper!” Jackson cries, using the nickname by which his family and friends once knew him. “I never referred to him as Prince. He might even have got mad at me when he got famous.” Ascend to the pantheon of pop music, it seems, and you still can’t quite make it out of the old neighborhood.
What does it take to wear an ancient Roman toga with dignity and grace?
Judging from the above demonstration by Dr Mary Harlow, Associate Professor of Ancient History at the University of Leicester, a couple of helpers, who, in the first century CE, would have invariably been enslaved, and thus ineligible for togas of their own.
The iconic outer garments, traditionally made of wool, begin as single, 12–16m lengths of fabric.
Extra hands were needed to keep the cloth from dragging on the dirty floor while the wearer was being wrapped, to secure the garment with additional pleats and tucks, and to create the pouch-like umbo at chest level, in a manner as aesthetically pleasing as every other fold and drape was expected to be.
As formal citizen’s garb, the toga was suitable for virtually every public occasion, as well as an audience with the emperor.
In addition to slaves, the toga was off-limits to foreigners, freedmen, and, with the notable exception of adulteresses and prostitutes, women.
Wealthier individuals flaunted their status by accenting their outfit with stripes of Tyrian Purple.
The BBC reports that dyeing even a single small swatch of fabric this shade “took tens of thousands of desiccated hypobranchial glands wrenched from the calcified coils of spiny murex sea snails” and that thus dyed, the fibers “retained the stench of the invertebrate’s marine excretions.”
Achieving that Tyrian Purple hue was “a very smelly process,” Dr. Harlow confirms, “but if you could retain a little bit of that fishy smell in your final garment, it would show your colleagues that you could afford the best.”
The students also share how toga-clad Romans dealt with stairs, and introduce viewers to five forms of toga:
Toga Virilis — the toga of manhood
Toga Praetexta — the toga worn by boys before the toga of manhood
Toga Pulla — a dark mourning toga
Toga Candida — a chalk-whitened toga sported by those running for office
Toga Picta — worn by generals, praetors celebrating games, and consuls. The emperor’s toga picta was dyed purple. Uh-oh.
Their youthful enthusiasm for antiquity is rousing, though Quintilian, the first-century CE educator and expert in rhetoric, might have had some thoughts on their clownish antics.
He certainly had a lot of thoughts about togas, which he shared in his instructive masterwork, Institutio Oratoria:
The toga itself should, in my opinion, be round, and cut to fit, otherwise there are a number of ways in which it may be unshapely. Its front edge should by preference reach to the middle of the shin, while the back should be higher in proportion as the girdle is higher behind than in front. The fold is most becoming, if it fall to a point a little above the lower edge of the tunic, and should certainly never fall below it. The other fold which passes obliquely like a belt under the right shoulder and over the left, should neither be too tight nor too loose. The portion of the toga which is last to be arranged should fall rather low, since it will sit better thus and be kept in its place. A portion of the tunic also should be drawn back in order that it may not fall over the arm when we are pleading, and the fold should be thrown over the shoulder, while it will not be unbecoming if the edge be turned back. On the other hand, we should not cover the shoulder and the whole of the throat, otherwise our dress will be unduly narrowed and will lose the impressive effect produced by breadth at the chest. The left arm should only be raised so far as to form a right angle at the elbow, while the edge of the toga should fall in equal lengths on either side.
Quintilian was willing to let some of his high standards slide if the wearer’s toga had been untidied by the heat of rousing oration:
When, however, our speech draws near its close, more especially if fortune shows herself kind, practically everything is becoming; we may stream with sweat, show signs of fatigue, and let our dress fall in careless disorder and the toga slip loose from us on every side… On the other hand, if the toga falls down at the beginning of our speech, or when we have only proceeded but a little way, the failure to replace it is a sign of indifference, or sloth, or sheer ignorance of the way in which clothes should be worn.
We’re pretty sure he would have frowned on classical archaeologist Shelby Brown’s experiments using a twin-size poly-blend bed sheet in advance of an early 21st-century College Night at the Getty Villa.
Prospective guests were encouraged to attend in their “best togas.”
Could it be that the party planners, envisioning a civilized night of photo booths, classical art viewing, and light refreshments in the Herculaneum-inspired Getty Villa, were truly ignorant of 1978’s notorious John Belushi vehicle Animal House?