Here’s something I can get pretty jazzed about. Er, maybe that’s not quite the right word. But close enough…
On May 13, the Eastman School of Music at the University of Rochester will launch the first of two MOOCs that will trace the history of rock music. Taught by John Covach, a professor of music theory, The History of Rock, Part One will revisit the 1950s and 1960s, the halcyon days of rock ’n’ roll, which gave us the music of Elvis Presley, Chuck Berry, Phil Spector, Bob Dylan, the Beatles, the Rolling Stones, Jimi Hendrix, Cream, and other artists. The course will focus on the music itself, the cultural context from which rock emerged, and how changes in the music business and music technology shaped this new musical form. The second course (scheduled to start on July 8) will move forward to the 1970s, ’80s, and ’90s and cover the music of Led Zeppelin, the Allman Brothers, Carole King, Bob Marley, the Sex Pistols, Donna Summer, Michael Jackson, Madonna, Prince, Metallica, Run-DMC, Nirvana, and many others. Students who successfully complete the course will receive a “Statement of Accomplishment” signed by the instructor.
Like many positive terms, the phrase “big bang” originated as a pejorative. Fred Hoyle coined the term in 1949 as a way of deflating the concept of an expanding universe. It stuck, even after Edwin Hubble’s evidence of an expanding universe implied that 13.7 billion years ago, all of the matter in our massive universe was indeed compacted into “one superdense ball.” Astronomers have also figured out that the volume of the big bang was only 120 decibels, about the loudness of your average rock show (though how there might have been sound without an atmosphere escapes me). There is some irony in Hoyle’s dig: the “big bang” wasn’t particularly big, and wasn’t much of a bang, but it happened.
The university describes the new astronomy series like this:
Ever wondered where the Universe came from? Or more importantly, where it’s headed? Voiced by David Mitchell, this series of twelve 60-second animations examines different scientific concepts from the big bang to relativity, from black holes to dark matter. The series also explores the possibility of life beyond Earth and considers why David Bowie is still none the wiser about life on Mars.
Spend a few extra minutes educating yourself with some more 60-second astronomy adventures below, or visit the complete collection here on YouTube or iTunesU.
Espresso is his palette. Coffee is his medium. Welcome to the artistic world of Mike Breach, an NYC barista who painstakingly “paints” portraits on lattes and cappuccinos. After you visit Breach’s tumblr filled with “BaristArt,” you’ll never be quite so impressed by that heart-shaped design other baristas pour onto your expensive foam.
The apotheosis of cyberpunk culture, 1999’s The Matrix and its less-successful sequels introduced a generation of fanboys and girls to the most stylish expression of some age-old idealist thought experiments: the Hindu concept of Maya, Plato’s cave, Descartes’ evil demon, Hilary Putnam’s Brain in a Vat—all notions about the nature of reality that ask whether what we experience isn’t instead an elaborate illusion, concealing a “real” world outside of our perceptual grasp. In some versions—such as those of certain Buddhists and Christian Gnostics, whose ideas The Matrix directors borrowed liberally—one can awaken from the dream. In others, such as Kant’s or Jacques Lacan’s, that prospect is unlikely, if not impossible. These questions about the nature of reality versus appearance are mainstays of intro philosophy courses and stereotypical stoner sessions. But they’re also perennially relevant to philosophers and neuroscientists, which is why such academic luminaries as Daniel Dennett and David Chalmers continue to address them in their work on the nature and problem of consciousness.
Dennett, Chalmers, the always captivating scholar/theologian/activist Cornel West, and a host of other academic thinkers, appear in the documentary above, Philosophy and the Matrix: Return to the Source. Part of the sprawling box-set The Ultimate Matrix Collection, the film comments on how The Matrix does much more than dramatize an undergraduate thesis; it takes on questions about religious revelation and authority, parapsychology, free will and determinism, and the nature of personal identity in ways that no dry philosophical text or arcane mystical system has before, thanks to its hip veneer and pioneering use of CGI. While some of the thinkers above might see more profundity than the movies seem to warrant, it’s still interesting to note how each film glosses the great metaphysical questions that intrigue us precisely because the answers seem forever out of reach.
Recently we posted a remarkable pair of videos featuring Bob Dylan and Van Morrison singing together on a hilltop in Athens. Today we’re back with another rare duet from the legendary singer-songwriters, this one recorded nine years after the jam session in Greece.
The performance took place on June 24, 1998 at the National Exhibition Centre in Birmingham, England. Dylan was on a world tour to support his Time Out of Mind album, which had been released the previous fall. Morrison shared the bill for parts of the tour, including shows in North America, Northern Ireland, England, Scotland and France. Morrison usually opened for Dylan, but on at least two occasions Morrison closed the show: in his native Belfast, and in Birmingham.
Near the end of Dylan’s Birmingham set, the audience was surprised when Morrison walked onstage in his sunglasses and pork pie hat. The two sang a duet of “Knockin’ on Heaven’s Door,” with Dylan playing acoustic guitar and Morrison the harmonica. It was a rare event: With only a couple of brief exceptions earlier in the tour, the two superstars kept their appearances separate. Fortunately, someone with a video camera was there to capture the moment.
On Tuesday, we gave you a Visualization of the Big Problem for MOOCs, which comes down to this: low completion rates. To be clear, the completion rates aren’t so much a problem for you; they’re more a problem for the MOOC providers and their business models. But let’s not get bogged down in that. We ended our post by asking you to share your own experience with MOOCs — particularly, to tell us why you started and stopped a MOOC. We got close to 50 thoughtful responses. And below we’ve summarized the 10 most commonly cited reasons. Here they are:
1.) Takes Too Much Time: Sometimes you enroll in a MOOC, only to discover that it takes way too much time. “Just didn’t have time to do all the work.” “As a full-time working adult, I found it exceedingly difficult to watch hours upon hours of video lectures.” That’s a refrain we heard again and again.
2.) Assumes Too Much Knowledge: Other times you enroll in a MOOC, only to find that it requires too much base knowledge, like a knowledge of advanced mathematics. That makes the course an instant non-starter. So you opt out. Simple as that.
3.) Too Basic, Not Really at the Level of Stanford, Oxford and MIT: On the flip side, some say that their MOOCs weren’t really operating on a serious university level. The coursework was too easy, and the workload and assignments weren’t demanding enough. A literature course felt more like a glorified book club. In short, the courses weren’t the real university deal.
4.) Lecture Fatigue: MOOCs often rely on formal video lectures, a format many of you consider “obsolete and inefficient.” And sometimes they’re just boring. MOOCs would be better served if they relied more heavily on interactive forms of pedagogy. Val put it well when she said, “We should not try to bring a brick and mortar lecture to your living room. Use the resources available and make the learning engaging with shorter segments.… The goal should be to teach and teach better. If one of these online universities can figure that out, then the money will follow.”
5.) Poor Course Design: You signed up for a MOOC and didn’t know how to get going. One student related his experience: “From day one I had no idea what I was supposed to do. There were instructions all over the place. Groups to join with phantom members that never commented or interacted, and a syllabus that was being revised as the course went through its first week.”
6.) Clunky Community/Communication Tools: This has been the Achilles’ heel of online learning for years, and so far the MOOCs haven’t quite figured it out. It’s not unusual to hear this kind of comment from students: “I find that the discussion forums aren’t very useful or engaging. They are not a very good substitute for active in-class discussion.”
7.) Bad Peer Review & Trolls: Because MOOCs are so big, you often don’t get feedback from the professor. Instead you get it from algorithms and peers. And sometimes the peers can be less than constructive. One reader writes: “I chose to stop doing the peer response section of the class due to some students being treated rudely [by other students]; in fact, the entire peer response section of the class is done in a way I would NEVER have asked of students in a classroom.… [T]here is no involvement of the professor or TA’s in monitoring the TORRENT of complaints about peer reviews.”
8.) Surprised by Hidden Costs: Sometimes you discover that free MOOCs aren’t exactly free. They have hidden costs. Brooke dropped her MOOC when she realized that the readings were from the professor’s expensive textbook.
9.) You’re Just Shopping Around: You shop for courses, which involves registering for many courses, keeping some, and dropping others. That inflates the low completion rate, but it gives you freedom. As one reader said, “I am very, very happy about being able to be so picky.”
10.) You’re There to Learn, Not for the Credential at the End: Sometimes you do everything (watch the videos, do the readings, etc.) but take the final exam. In a certain way, you’re auditing, which suits many of you just fine. It’s precisely what you want to do. But that, too, makes the low completion rates look worse than they maybe are.
Thanks to everyone who took the time to participate. We really appreciate it! And if you’re looking for a new MOOC, don’t miss our list, 300 Free MOOCs from Great Universities (Many Offering Certificates).
“It’s not what a movie is about, it’s how it is about it.” Words of writerly wisdom from the late Roger Ebert, whom several generations of Americans came to recognize not just as a film critic, but as the very personification of film criticism. He earned this place in the country’s zeitgeist by mastering two starkly disparate types of media: the medium-length but always substantial review written for newspapers, and the short conversational review broadcast on television. The former we read in the form of his syndicated film pieces for the Chicago Sun-Times; the latter we watched on Siskel and Ebert at the Movies. After his co-host Gene Siskel’s passing in 1999, Ebert continued with Roger Ebert and the Movies, followed by Ebert and Roeper and the Movies. But longtime fans of his film criticism on television, and new fans discovering the show’s old episodes on the internet, will always look back to Ebert’s on-air debates — which sometimes devolved, simply, into fights — as the peak of the form, at least in terms of entertainment value. Above you’ll find a classic example in Siskel and Ebert’s tiff over the firefight in Stanley Kubrick’s Full Metal Jacket. “I have never felt a kill in a movie quite like that,” insists Siskel. “Not in Apocalypse Now? Not in The Deer Hunter? Not in Platoon?” Ebert asks before his riposte: “In that case, you’re going to love the late show, because they have kills like that every night in black and white starring John Wayne.” (BTW, we have a collection of John Wayne films here.)
Ebert knew how to deliver that metaphorical punch (and, when necessary, to approach the edge of actual fisticuffs) on television. In print, he knew how to remain curious and thoughtful even when served each week’s heaping helping of studio mediocrity. This milder, more complicated, vastly knowledgeable critical persona comes through in his 1996 conversation with Charlie Rose (part one, part two) just above. Though he could celebrate and dismiss with the utmost conviction, he also understood that the film critic has higher duties than evaluation. He demonstrates this understanding all throughout his review archive, which, embracing the web before most critics of his generation, he’d put online by the mid-nineties. Back then, I spent an hour or two every day after school in the library, plowing through his back pages. I thought I was learning about the movies, as indeed I was, and I was certainly learning a thing or two about reviewing the movies, but I was above all learning about the whole craft of writing, and thus about approaching the world, cinematic and otherwise. We won’t remember Roger Ebert for the stars he doled out and withheld, nor for the angle of his thumbs; we’ll remember him for his ability to, through the lens of the movies, consider life itself.
They toyed with the idea of a donkey, but they went with four sheep instead, and now four ewes are mowing the grounds of Paris’ Municipal Archives. It’s all part of a pilot program: if it succeeds, sheep will trim the grass of Parisian public spaces, burning no fossil fuels along the way. The New York Times has more on this old school solution to a modern environmental problem.
Film critic Roger Ebert, like Pauline Kael before him, leaves behind a great torrent of words. Those of us accustomed to seeking out his opinion can comfort ourselves on the Internet, where his thoughts on the great (and not-so-great) films of the last four decades live in perpetuity.
After a ruptured carotid artery robbed him of the power of speech, words assumed an even greater importance for Ebert. Even though he felt lucky to be alive in an age when most home computers come equipped with a text-to-speech option, he mourned the loss of inflection, timing, and spontaneity. CereProc, a Scottish firm specializing in personalized computer voices, created a custom version he breezily referred to as Roger Junior or Roger 2.0, a Frankenstein’s monster assembled from hours of television appearances. A noble, but flawed attempt. Despite his Midwestern attraction to Apple’s computerized British accent, Ebert returned to its American male voice, “Alex,” as the most expressive option.
In 2011, the speechless Ebert gave a TED Talk on the subject. “Alex” was given his moment to shine, but there’s no way the technological miracle can compete with the human spectacle onstage.
Rather than rely on the relatively autonomous voice substitute, Ebert arranged for his wife, Chaz Hammelsmith, and friends Dean Ornish and John Hunter to read his words from prepared scripts.
Forget W.C. Fields’ caveat about performing with children and dogs. Ebert stole his own show, shamelessly upstaging his loved ones with jolly pantomimed thumbs-ups and other antics. When he’s on camera, you can’t take your eyes off him…as he clearly knew. A 2010 Esquire article by Chicago-based theater critic Chris Jones described how the removal of Ebert’s lower jaw gave him the aspect of a permanent smile. The disfigurement was shocking, but especially so on one whose face was so familiar. It led many to assume, wrongly, that he had been mentally incapacitated as well. Hammelsmith’s tears when she gets to this part of her husband’s eloquent TED Talk speak volumes as well. His willingness to place himself front and center, where people who might think it impolite to stare could not help but see and hear him as a whole person, was a revolutionary act.
“Without intelligence and memory, there is no history.” — Roger Ebert, 1942–2013
To eat bacon sandwiches? Or not to eat bacon sandwiches? That’s a question tackled by David Spiegelhalter, who holds the title, “Winton Professor for the Public Understanding of Risk” at Cambridge University. Sometimes they just call him “Professor Risk” for short.
In his academic work, Spiegelhalter looks at risk and uncertainty every day, seeing how they affect the lives of individuals and society. You’d figure that this might make him more cautious than the rest of us. But that’s not how it turns out. After analyzing all of the data, Spiegelhalter comes to this conclusion: some calculated risks are worth it. They have minimal downside and make life worth living. Or, looked at a little differently, sometimes “one of the biggest risks [in life] is being too cautious.”
If you would like to sign up for Open Culture’s free email newsletter, please find it here. It’s a great way to see our new posts, all bundled in one email, each day.
If you would like to support the mission of Open Culture, consider making a donation to our site. It’s hard to rely 100% on ads, and your contributions will help us continue providing the best free cultural and educational materials to learners everywhere. You can contribute through PayPal, Patreon, and Venmo (@openculture). Thanks!
Popular Science is the fifth-oldest continuously published monthly magazine—a long way of saying that the magazine has done a fine job of maintaining a niche in a crazily fast-paced industry. Founded in 1872 by science writer Edward Youmans to reach an audience of educated laypeople, Popular Science today combines reviews of the latest gadgets with stories about innovation in design and science. It’s an organized mishmash of news about “the future now,” liberally defined. A recent issue included stories about the military’s use of 3‑D printing and an astrophysicist who questions whether Shakespeare wrote the entire Folio.
With that kind of breadth, the magazine’s archives cover just about everything. And it’s easy to browse through back issues, dating all the way back to 1872, since the magazine teamed up with Google to put a searchable archive on the web. Early issues, like this one from February 1920, feature color covers that bring to mind science fiction with a fascination for the imagined future.
One of the cool things about the magazine is its equal attention to new and old technology. Search for “scissors” and you will find this 1964 article about the mechanics of sharpening your own scissors. The archive also offers another search tool that returns results in a visual word frequency grid, which is especially cool if you click the “animate” button. Any social historians out there able to explain why the word “scissor” would appear so often in the mid-20th century?
Interestingly, although the word “internet” dates back to the 1960s, the word didn’t appear in the magazine’s pages until 1989.
Period advertisements are included, which adds to the fun. This issue from September 1944 includes a house advertisement on the table of contents page calling for all collectors of back issues to consider surrendering them for the war cause. “There’s a war going on and this is no time for sentiment,” the ad urges. “Grit your teeth and dig out those stacks of back numbers. Then turn them over to your local paper salvage drive!” Enter the archive here.
Open Culture scours the web for the best educational media. We find the free courses and audio books you need, the language lessons & educational videos you want, and plenty of enlightenment in between.