My costume design professor at Northwestern University, Virgil Johnson, delighted students with his formula for period clothing. I have forgotten some of the mathematical and semantic particulars—does dressing someone five years behind the times make for a “frumpy” character, or merely an unfashionable one?
I do recall some anxious hours preparing for the school’s main stage production of the incestuous Jacobean revenge tragedy ’Tis Pity She’s a Whore. The play’s societal corruption was underscored by having the supporting characters slouch around, snorting mimed cocaine, dressed in cutting-edge, mid-80s Vogue Patterns … those big unstructured jackets were very à la mode, but they gobbled up a lot of expensive fabric, and I didn’t want to be the one to make a costly sewing mistake.
What sticks in my mind most clearly is that 20 years was the sweet spot, the appropriate amount of elapsed time to ensure that one would not appear dumpy, dowdy, or oblivious, but rather prudent and discerning. Donning a garment that was 15 years out of fashion might be daringly “retro,” but another five and that same garment could be heralded as “vintage.”
The collaborative Vintage Pattern Wiki puts the magic number at 25, requesting that contributors make sure the patterns they post are from 1992 or earlier, and also out of print.
Visitors can narrow their search to focus on a particular garment, designer, or decade, with links to patterns from the 1920s, 1930s, 1940s, 1950s, 1960s, 1970s, and 1980s.
And it goes without saying that the dog days of summer are the perfect time to get a jump on your Halloween costume.
Those who are itching to get sewing should check the links below each pattern envelope cover for vendors who have the pattern in stock, as well as for photos and posts by community members who have made that same garment.
The prices and handwritten jottings of the original owners will also put you in a vintage mood.
We all operate at different levels of ambition: some just want to get by and enjoy themselves, while others strive for achievements with as long-lasting an impact on humanity as possible. If we think of candidates for the latter category, Charles Darwin may well come to mind, at least in the sense that the work he did as a naturalist, and more so the theory of evolution that came out of it, has ensured that we remember his name well over a century after his death and will surely continue to do so centuries hence. But research into Darwin’s working life suggests something less than workaholism — and indeed, that he put in a fraction of the number of hours we associate with serious ambition.
“After his morning walk and breakfast, Darwin was in his study by 8 and worked a steady hour and a half,” writes Nautilus’ Alex Soojung-Kim Pang. “At 9:30 he would read the morning mail and write letters. At 10:30, Darwin returned to more serious work, sometimes moving to his aviary, greenhouse, or one of several other buildings where he conducted his experiments. By noon, he would declare, ‘I’ve done a good day’s work,’ and set out on a long walk.” After this walk he would answer letters, take a nap, take another walk, go back to his study, and then have dinner with the family. Darwin typically got to bed, according to a daily schedule drawn from his son Francis’ reminiscences of his father, by 10:30.
“On this schedule he wrote 19 books, including technical volumes on climbing plants, barnacles, and other subjects,” writes Pang, who does not fail to mention, of course, “The Origin of Species, probably the single most famous book in the history of science, and a book that still affects the way we think about nature and ourselves.” Another textually prolific Victorian Englishman named Charles, adhering to a similarly non-life-consuming work routine, managed to produce — in addition to tireless letter-writing and campaigning for social reform — hundreds of short stories and articles, five novellas, and fifteen novels including Oliver Twist, A Tale of Two Cities, and Great Expectations.
“After an early life burning the midnight oil,” writes Pang, Charles Dickens “settled into a schedule as ‘methodical or orderly’ as a ‘city clerk,’ his son Charley said. Dickens shut himself in his study from 9 until 2, with a break for lunch. Most of his novels were serialized in magazines, and Dickens was rarely more than a chapter or two ahead of the illustrators and printer. Nonetheless, after five hours, Dickens was done for the day.” Pang finds that many other successful writers have kept similarly restrained work schedules, from Anthony Trollope to Alice Munro, Somerset Maugham to Gabriel García Márquez, Saul Bellow to Stephen King. He notes similar habits in science and mathematics as well, including Henri Poincaré and G.H. Hardy.
Research by Pang and others into work habits and productivity has recently drawn a great deal of attention, pointing as it does to the question of whether we might all consider working less in order to work better. “Even if you enjoy your job and work long hours voluntarily, you’re simply more likely to make mistakes when you’re tired,” writes the Harvard Business Review’s Sarah Green Carmichael. What’s more, “work too hard and you also lose sight of the bigger picture. Research has suggested that as we burn out, we have a greater tendency to get lost in the weeds.” This discovery actually dates back to Darwin and Dickens’ 19th century: “When organized labor first compelled factory owners to limit workdays to 10 (and then eight) hours, management was surprised to discover that output actually increased – and that expensive mistakes and accidents decreased.”
This goes just as much for academics, whose workweeks, “as long as they are, are not nearly as lengthy as those on Wall Street (yet),” writes Times Higher Education’s David Matthews in a piece on the research of University of Pennsylvania professor (and ex-Goldman Sachs banker) Alexandra Michel. “Four hours a day is probably the limit for those looking to do genuinely original research, she says. In her experience, the only people who have avoided burnout and achieved some sort of balance in their lives are those sticking to this kind of schedule.” Michel finds that “because academics do not have their hours strictly defined and regulated (as manual workers do), ‘other controls take over. These controls are peer pressure.’ ” So at least we know the first step on the journey toward viable work habits: regarding the likes of Darwin and Dickens as your peers.
We know they’re coming. The robots. To take our jobs. While humans turn on each other, find scapegoats, try to bring back the past, and ignore the future, machine intelligences replace us as quickly as their designers get them out of beta testing. We can’t exactly blame the robots. They don’t have any say in the matter. Not yet, anyway. But it’s a fait accompli, say the experts. “The promise,” writes MIT Technology Review, “is that intelligent machines will be able to do every task better and more cheaply than humans. Rightly or wrongly, one industry after another is falling under its spell, even though few have benefited significantly so far.”
The question, then, is not if, but “when will artificial intelligence exceed human performance?” And some answers come from a paper called, appropriately, “When Will AI Exceed Human Performance? Evidence from AI Experts.” In this study, Katja Grace of the Future of Humanity Institute at the University of Oxford and several of her colleagues “surveyed the world’s leading researchers in artificial intelligence by asking them when they think intelligent machines will better humans in a wide range of tasks.”
You can see many of the answers plotted on the chart above. Grace and her co-authors asked 1,634 experts, and found that they “believe there is a 50% chance of AI outperforming humans in all tasks in 45 years and of automating all human jobs in 120 years.” That means all jobs: not only driving trucks, delivering by drone, running cash registers, gas stations, phone support, weather forecasts, investment banking, etc., but also performing surgery, which may happen in less than 40 years, and writing New York Times bestsellers, which may happen by 2049.
That’s right, AI may perform our cultural and intellectual labor, making art and films, writing books and essays, and creating music. Or so the experts say. Already a Japanese AI program has written a short novel, and almost won a literary prize for it. And the first milestone on the chart has already been reached; last year, Google’s AI AlphaGo beat Lee Sedol, the South Korean grandmaster of Go, the ancient Chinese game “that’s exponentially more complex than chess,” as Cade Metz writes at Wired. (Humane video game design, on the other hand, may have a ways to go yet.)
Perhaps these feats partly explain why, as Grace and the other researchers found, Asian respondents expected the rise of the machines “much sooner than North America.” Other cultural reasons surely abound—likely those same quirks that lead tens of millions of Americans to embrace creationism, climate denial, fearful conspiracy theories, and nostalgia. The future may be frightening, but we should have seen this coming. Sci-fi visionaries have warned us for decades to prepare for our technology to overtake us.
In the 1960s Alan Watts foresaw the future of automation and the almost pathological fixation we would develop for “job creation” as more and more necessary tasks fell to the robots and human labor became increasingly superfluous. (Hear him make his prediction above.) Like many a technologist and futurist today, Watts advocated for Universal Basic Income, a way of ensuring that all of us have the means to survive while we use our newly acquired free time to consciously shape the world the machines have learned to maintain for us.
What may have seemed like a Utopian idea then (though it almost became policy under Nixon) may become a necessity as AI changes the world, writes MIT Technology Review, “at breakneck speed.”
Once upon a time, books served as the de facto refuge of the “physically weak” child. For animation legend Hayao Miyazaki, above, they offered an escape from the grimmer realities of post-World War II Japan.
Many of the 50 favorites he selected for a 2010 exhibition honoring publisher Iwanami Shoten’s “Boy’s Books” series are time-tested Western classics.
And while it may be a commonly held publishing belief that boys won’t read stories about girls, the young Miyazaki seemed to have no such bias, ranking Heidi and Laura Ingalls Wilder right alongside Tom Sawyer and Treasure Island’s pirates.
Ayun Halliday is an author, illustrator, theater maker and Chief Primatologist of the East Village Inky zine. She’ll be appearing onstage in New York City this June as one of the clowns in Paul David Young’s Faust 3. Follow her @AyunHalliday.
We’re just days away from the 50th anniversary of the release of The Beatles’ Sgt. Pepper’s Lonely Hearts Club Band. And, as we mentioned last week, the BBC has kicked off the celebrations with a series of videos that introduce you to the 60+ figures who appeared in the cardboard collage that graced the album’s iconic cover. Bob Dylan, Edgar Allan Poe, William S. Burroughs, Albert Einstein, Marilyn Monroe, H.G. Wells, Shirley Temple–all of them, among many others, get a video introduction.
Historic as it is, the Pepper cover recently became a good vehicle for remembering the bewildering number of musicians, artists and celebrities who left this mortal coil in 2016. Above you can see an illustration created by Twitter user @christhebarker in the waning days of last year. If you look closely, you can see some thought went into the design. Muhammad Ali, for example, now stands where boxer Sonny Liston did in the original. Find them all in a larger format here.
Time is a measure of energy, a measure of motion. And we have agreed internationally on the speed of the clock. And I want you to think about clocks and watches for a moment. We are of course slaves to them. And you will notice that your watch is a circle, and that it is calibrated, and that each minute, or second, is marked by a hairline which is made as narrow as possible, as is consistent with being visible.
However true, that’s a particularly stress-inducing observation from one who was known for his Zen teachings…
The pressure is ameliorated somewhat by Bob McClay’s trippy time-based animation, above, narrated by Watts. Putting Mickey Mouse on the face of Big Ben must’ve gone over well with the countercultural youth who eagerly embraced Watts’ Eastern philosophy. And the tangible evidence of real live magic markers will prove a tonic to those who came of age before animation’s digital revolution.
The short originally aired as part of the early-’70s series The Fine Art of Goofing Off, described by one of its creators, the humorist and sound artist Henry Jacobs, as “Sesame Street for grown-ups.”
Time preoccupied both men.
One of Jacobs’ fake commercials on The Fine Art of Goofing Off involved a pitchman exhorting viewers to stop wasting time at idle pastimes: Log a few extra golden hours at the old grindstone.
A koan-like skit featured a gramophone through which a disembodied voice endlessly asks a stuffed dog, “Can you hear me?” (Jacobs named that as a personal favorite.)
And when we think of a moment of time, when we think what we mean by the word “now,” we think of the shortest possible instant that is here and gone, because that corresponds with the hairline on the watch. And as a result of this fabulous idea, we are a people who feel that we don’t have any present, because the present is instantly vanishing — it goes so quickly. It is always becoming past. And we have the sensation, therefore, of our lives as something that is constantly flowing away from us. We are constantly losing time. And so we have a sense of urgency. Time is not to be wasted. Time is money. And so, because of the tyranny of this thing, we feel that we have a past, and we know who we are in terms of our past. Nobody can ever tell you who they are; they can only tell you who they were.